Dec 05 20:10:28 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 20:10:28 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:28 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 20:10:29 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 20:10:29 crc 
restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 20:10:29 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 05 20:10:29 crc kubenswrapper[4744]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.881941 4744 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888372 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888414 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888423 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888432 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888441 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888449 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888459 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888468 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888476 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888484 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888491 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888500 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888508 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888516 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888524 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888532 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888540 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888548 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888555 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888563 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888571 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888579 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888587 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888595 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888602 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888610 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888617 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888625 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888636 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888647 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888656 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888664 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888674 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888684 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888707 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888716 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888724 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888732 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888740 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888748 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888756 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888764 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888771 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888779 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888787 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888796 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888806 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888814 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888821 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888829 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888837 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888844 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888855 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888864 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888873 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888883 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888893 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888902 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888910 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888919 4744 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888926 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888935 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888944 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888951 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888964 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888972 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888982 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888990 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.888998 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.889005 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.889013 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889232 4744 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889252 4744 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889266 4744 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889278 4744 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889313 4744 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889339 4744 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889351 4744 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889362 4744 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889370 4744 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889379 4744 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889389 4744 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889400 4744 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889409 4744 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889418 4744 flags.go:64] FLAG: --cgroup-root="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889427 4744 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889436 4744 flags.go:64] FLAG: --client-ca-file="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889444 4744 flags.go:64] FLAG: --cloud-config="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889453 4744 flags.go:64] FLAG: --cloud-provider="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889462 4744 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889473 4744 flags.go:64] FLAG: --cluster-domain="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889482 4744 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889491 4744 flags.go:64] FLAG: --config-dir="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889500 4744 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889509 4744 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889520 4744 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889529 4744 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889538 4744 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889548 4744 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889557 4744 flags.go:64] FLAG: --contention-profiling="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889566 4744 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889575 4744 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889584 4744 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889593 4744 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889605 4744 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889614 4744 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889623 4744 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889632 4744 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889640 4744 flags.go:64] FLAG: --enable-server="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 
20:10:29.889649 4744 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889660 4744 flags.go:64] FLAG: --event-burst="100" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889669 4744 flags.go:64] FLAG: --event-qps="50" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889678 4744 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889687 4744 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889696 4744 flags.go:64] FLAG: --eviction-hard="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889706 4744 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889715 4744 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889724 4744 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889734 4744 flags.go:64] FLAG: --eviction-soft="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889743 4744 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889752 4744 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889760 4744 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889769 4744 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889778 4744 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889787 4744 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889796 4744 flags.go:64] FLAG: --feature-gates="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889806 4744 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889816 4744 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889825 4744 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889834 4744 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889843 4744 flags.go:64] FLAG: --healthz-port="10248" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889852 4744 flags.go:64] FLAG: --help="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889861 4744 flags.go:64] FLAG: --hostname-override="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889871 4744 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889882 4744 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889893 4744 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889903 4744 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889912 4744 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889921 4744 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889929 4744 flags.go:64] 
FLAG: --image-service-endpoint="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889939 4744 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889947 4744 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889956 4744 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889965 4744 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889974 4744 flags.go:64] FLAG: --kube-reserved="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889983 4744 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.889992 4744 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890001 4744 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890009 4744 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890019 4744 flags.go:64] FLAG: --lock-file="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890028 4744 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890036 4744 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890045 4744 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890068 4744 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890078 4744 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890087 4744 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890096 4744 flags.go:64] FLAG: --logging-format="text" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890105 4744 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890114 4744 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890123 4744 flags.go:64] FLAG: --manifest-url="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890132 4744 flags.go:64] FLAG: --manifest-url-header="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890144 4744 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890154 4744 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890165 4744 flags.go:64] FLAG: --max-pods="110" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890174 4744 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890219 4744 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890229 4744 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890238 4744 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890248 4744 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890257 4744 flags.go:64] 
FLAG: --node-ip="192.168.126.11" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890266 4744 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890286 4744 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890317 4744 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890327 4744 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890336 4744 flags.go:64] FLAG: --pod-cidr="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890345 4744 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890360 4744 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890368 4744 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890378 4744 flags.go:64] FLAG: --pods-per-core="0" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890388 4744 flags.go:64] FLAG: --port="10250" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890397 4744 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890406 4744 flags.go:64] FLAG: --provider-id="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890415 4744 flags.go:64] FLAG: --qos-reserved="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890424 4744 flags.go:64] FLAG: --read-only-port="10255" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890433 4744 flags.go:64] FLAG: --register-node="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890442 4744 flags.go:64] FLAG: --register-schedulable="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890451 4744 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890466 4744 flags.go:64] FLAG: --registry-burst="10" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890475 4744 flags.go:64] FLAG: --registry-qps="5" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890483 4744 flags.go:64] FLAG: --reserved-cpus="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890493 4744 flags.go:64] FLAG: --reserved-memory="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890505 4744 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890514 4744 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890523 4744 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890533 4744 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890541 4744 flags.go:64] FLAG: --runonce="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890550 4744 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890560 4744 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890569 4744 flags.go:64] FLAG: --seccomp-default="false" 
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890578 4744 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890586 4744 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890596 4744 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890605 4744 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890614 4744 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890623 4744 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890632 4744 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890641 4744 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890649 4744 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890659 4744 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890668 4744 flags.go:64] FLAG: --system-cgroups="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890676 4744 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890690 4744 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890699 4744 flags.go:64] FLAG: --tls-cert-file="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890707 4744 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890718 4744 flags.go:64] FLAG: --tls-min-version="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890727 4744 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890736 4744 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890744 4744 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890753 4744 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890762 4744 flags.go:64] FLAG: --v="2" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890774 4744 flags.go:64] FLAG: --version="false" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890785 4744 flags.go:64] FLAG: --vmodule="" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890796 4744 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.890805 4744 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891015 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891026 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891037 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891047 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891058 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891067 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891075 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891083 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891090 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891099 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891107 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891114 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891122 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891130 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891138 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891146 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891154 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891162 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891170 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891178 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891186 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891194 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891202 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891210 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891221 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891232 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891241 4744 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891257 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891268 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891277 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891286 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891327 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891336 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891344 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891352 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891360 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891368 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891376 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891385 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891393 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891401 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891413 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891420 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891428 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891436 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891445 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891453 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891461 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891471 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891481 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891490 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891498 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891508 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891517 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891525 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891533 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891541 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891549 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891558 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891569 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891577 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891585 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891594 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891602 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891610 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891618 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891626 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891634 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891643 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891650 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.891658 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.891670 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.902631 4744 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.902675 4744 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902838 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902851 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902861 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902870 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902878 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902886 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902894 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902902 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902910 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902918 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902925 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902933 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902941 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902948 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902956 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902964 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902972 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902980 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902988 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.902995 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903004 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903011 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903019 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903027 4744 feature_gate.go:330] unrecognized feature gate: 
GCPClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903034 4744 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903042 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903050 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903058 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903069 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903081 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903090 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903099 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903108 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903116 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903126 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903134 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903142 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903150 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903160 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903171 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903180 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903188 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903197 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903207 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903215 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903224 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903233 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903241 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903249 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903258 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903266 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903273 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903281 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903313 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903322 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903330 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903338 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903345 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903355 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903365 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903374 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903382 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903390 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903401 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903412 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903422 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903430 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903437 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903445 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903453 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903462 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.903475 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903698 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903713 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903722 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903730 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903738 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903745 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903753 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903761 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903769 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 20:10:29 crc kubenswrapper[4744]: 
W1205 20:10:29.903777 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903788 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903799 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903809 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903818 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903826 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903836 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903846 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903856 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903865 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903874 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903882 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903890 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903898 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903906 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903914 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903922 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903930 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903938 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903946 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903954 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903961 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903969 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903976 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.903984 4744 feature_gate.go:330] unrecognized feature gate: Example Dec 05 20:10:29 crc 
kubenswrapper[4744]: W1205 20:10:29.903993 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904004 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904014 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904022 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904031 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904040 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904048 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904056 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904063 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904071 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904079 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904087 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904094 4744 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904102 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904109 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904118 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904126 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904133 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904141 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904149 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904158 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904167 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904177 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904186 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904193 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904201 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904210 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904217 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904225 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904232 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904240 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904248 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904256 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904263 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904271 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904279 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.904311 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.904325 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.904666 4744 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.909196 4744 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.909365 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
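The wrapper parses its gate set four times during this start (the feature_gate.go:330 floods at 29.888, 29.891, 29.902, and 29.903 above), so the same unrecognized-gate warnings repeat verbatim and the final feature gates: {map[...]} summary comes out identical on every pass. A minimal sketch that collapses the flood to one sorted list of unique names, assuming the same hypothetical kubelet-journal.log as the sketch further up:

    # unique_feature_gates.py -- minimal sketch over the hypothetical
    # kubelet-journal.log dump of this journal.
    import re

    UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

    def unique_gates(path: str) -> list[str]:
        # Dedupe the repeated feature_gate.go:330 warnings.
        with open(path, encoding="utf-8") as fh:
            return sorted(set(UNRECOGNIZED.findall(fh.read())))

    if __name__ == "__main__":
        gates = unique_gates("kubelet-journal.log")
        print(len(gates), "unique unrecognized feature gates")
        print("\n".join(gates))

The gates kubelet does recognize are the ones the summary line records: the GA overrides ValidatingAdmissionPolicy, CloudDualStackNodeIPs, and DisableKubeletCloudCredentialProviders, plus the deprecated KMSv1, each set to true on every pass.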
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.910131 4744 server.go:997] "Starting client certificate rotation"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.910161 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.910394 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-18 02:30:44.645853712 +0000 UTC
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.910500 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.917713 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:10:29 crc kubenswrapper[4744]: E1205 20:10:29.919245 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.921785 4744 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.933071 4744 log.go:25] "Validated CRI v1 runtime API"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.954936 4744 log.go:25] "Validated CRI v1 image API"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.957085 4744 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.961748 4744 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-20-06-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.961791 4744 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.987607 4744 manager.go:217] Machine: {Timestamp:2025-12-05 20:10:29.985530469 +0000 UTC m=+0.215341917 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:81235d50-4058-490a-b9b8-3ea7ecb9321c BootID:19d8c788-01c0-4af7-b075-d7b6a1f1aadc Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4
Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7f:db:fa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7f:db:fa Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5f:75:13 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5d:5b:13 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:42:22:13 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:44:b4:60 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:39:ec:10:72:16 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:2a:a0:83:70:c0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.988004 4744 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.988173 4744 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.989457 4744 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.990044 4744 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.990119 4744 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.990682 4744 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.990719 4744
container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.991165 4744 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.991240 4744 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.991835 4744 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.992033 4744 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.993151 4744 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.993202 4744 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.993256 4744 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.993321 4744 kubelet.go:324] "Adding apiserver pod source"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.993346 4744 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.995465 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Dec 05 20:10:29 crc kubenswrapper[4744]: W1205 20:10:29.995530 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Dec 05 20:10:29 crc kubenswrapper[4744]: E1205 20:10:29.995722 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.995791 4744 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 20:10:29 crc kubenswrapper[4744]: E1205 20:10:29.995806 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.996351 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
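[Annotation] The nodeConfig record above fixes the node-allocatable arithmetic for this machine: kubeReserved is null, systemReserved is 200m CPU / 350Mi memory / 350Mi ephemeral-storage, and the memory.available hard-eviction threshold is 100Mi. Against the 33654128640-byte MemoryCapacity from the Machine record, allocatable memory works out as capacity minus systemReserved minus the eviction threshold. A quick worked check of that formula, using only values taken from the log:

```go
// allocatable = capacity - kubeReserved - systemReserved - hardEviction
// (kubeReserved is null in this nodeConfig, so it drops out).
package main

import "fmt"

func main() {
	const mi = 1024 * 1024
	capacity := int64(33654128640)    // MemoryCapacity from the Machine record
	systemReserved := int64(350 * mi) // SystemReserved "memory":"350Mi"
	evictionHard := int64(100 * mi)   // memory.available LessThan 100Mi

	allocatable := capacity - systemReserved - evictionHard
	fmt.Printf("allocatable memory: %d bytes (~%.1f GiB)\n",
		allocatable, float64(allocatable)/(1024*1024*1024))
}
```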
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.997269 4744 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998116 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998157 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998172 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998189 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998250 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998268 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998282 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998342 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998364 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998380 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998399 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998415 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.998715 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 20:10:29 crc kubenswrapper[4744]: I1205 20:10:29.999373 4744 server.go:1280] "Started kubelet"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.000176 4744 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 20:10:30 crc systemd[1]: Started Kubernetes Kubelet.
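[Annotation] Every "connection refused" around this point targets api-int.crc.testing:6443, and that is expected ordering, not a fault: the kubelet starts first and is itself responsible for launching the kube-apiserver static pod from /etc/kubernetes/manifests, so its reflectors and controllers simply retry until the API answers. A minimal sketch of that retry pattern using apimachinery's wait helpers (the backoff parameters here are illustrative, not the kubelet's actual values):

```go
// Retry a TCP dial to the API endpoint with exponential backoff,
// mirroring how startup-time "connection refused" errors resolve themselves.
package main

import (
	"fmt"
	"net"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	backoff := wait.Backoff{Duration: 200 * time.Millisecond, Factor: 2.0, Jitter: 0.1, Steps: 5}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		conn, dialErr := net.DialTimeout("tcp", "api-int.crc.testing:6443", time.Second)
		if dialErr != nil {
			fmt.Println("still refusing:", dialErr)
			return false, nil // not done; retry after the next backoff step
		}
		conn.Close()
		return true, nil // API server is answering
	})
	fmt.Println("result:", err) // nil once the dial succeeds
}
```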
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.000126 4744 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.001676 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.001631 4744 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004128 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004196 4744 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004208 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:29:36.373354358 +0000 UTC
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004346 4744 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004385 4744 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.004418 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.004534 4744 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.005530 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.005645 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError"
Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.006517 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms"
Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.005471 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e6ab517f3c212 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:10:29.99928885 +0000 UTC m=+0.229100258,LastTimestamp:2025-12-05 20:10:29.99928885 +0000 UTC m=+0.229100258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.009893 4744 factory.go:153] Registering CRI-O factory
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.009941 4744 factory.go:221] Registration of the crio container factory successfully
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010073 4744 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010098 4744 factory.go:55] Registering systemd factory
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010117 4744 factory.go:221] Registration of the systemd container factory successfully
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010156 4744 factory.go:103] Registering Raw factory
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010189 4744 manager.go:1196] Started watching for new ooms in manager
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.010511 4744 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.011931 4744 manager.go:319] Starting recovery of all containers
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022104 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022283 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022552 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022581 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022611 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022635 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022654 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69"
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022674 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022696 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022716 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022736 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022755 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022800 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022830 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022868 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022893 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022924 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022951 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.022981 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023007 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023082 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023112 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023138 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023163 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023190 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023214 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023246 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023344 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023379 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023404 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023431 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023456 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023484 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023510 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023537 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023566 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023591 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023616 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023640 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023666 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023694 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023719 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023747 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023777 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023803 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023829 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023853 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023879 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023906 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023934 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023963 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.023988 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024024 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024057 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024085 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024113 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024140 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024167 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024645 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024679 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024707 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024734 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024762 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024786 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024815 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024845 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024873 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024900 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024927 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024954 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.024980 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025006 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025031 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025055 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025081 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025106 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025131 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025156 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025185 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.025213 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026712 4744 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026768 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026802 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026832 4744 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026863 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026889 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026914 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026940 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026969 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.026994 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027022 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027049 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027073 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027096 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027124 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027149 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027178 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027204 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027229 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027254 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027279 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027344 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027373 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027399 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027424 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027462 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027494 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027525 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027556 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027584 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027613 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027641 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027670 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027696 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027725 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027752 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027779 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027804 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027830 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027857 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027883 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027910 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027935 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027962 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.027987 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028015 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028061 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028091 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028124 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028150 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028177 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028202 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028228 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028254 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028278 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028341 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028367 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028407 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028452 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028481 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028507 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028534 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028559 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028597 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028645 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028676 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028703 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028730 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028759 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028797 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028824 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028854 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028880 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028906 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028932 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028957 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.028981 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029008 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029035 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029063 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029094 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029120 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029148 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029192 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029227 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029257 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.029284 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030198 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030266 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030328 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030360 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030396 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030431 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.030465 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032176 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032350 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032390 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032433 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032454 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032495 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032518 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032548 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032590 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032612 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032644 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032666 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032737 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032775 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032794 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.032983 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033007 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033128 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033154 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033180 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033313 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033344 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033730 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033766 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033786 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033805 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033837 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033857 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033952 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.033983 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.034004 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.034031 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.035878 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.035904 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.035922 4744 reconstruct.go:97] "Volume reconstruction finished" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.035934 4744 reconciler.go:26] "Reconciler: start to sync state" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.044490 4744 manager.go:324] Recovery completed Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.055252 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.056817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.056991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.057023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.058109 4744 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.058130 4744 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.058155 4744 state_mem.go:36] "Initialized new in-memory state store" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.070407 4744 policy_none.go:49] "None policy: Start" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.072006 4744 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.072072 4744 state_mem.go:35] "Initializing new in-memory state store" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.076394 4744 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.079342 4744 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.079383 4744 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.079409 4744 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.079503 4744 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.087149 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.087231 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.105180 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.135801 4744 manager.go:334] "Starting Device Plugin manager" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.135857 4744 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.135872 4744 server.go:79] "Starting device plugin registration server" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.136360 4744 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.136375 4744 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.136776 4744 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.136883 4744 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.136893 4744 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.145195 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.180310 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.180396 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181191 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181339 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181705 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.181749 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182208 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182494 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182561 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.182963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183099 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183571 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183640 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183769 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.183832 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.184263 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.184301 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.184516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.184546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.184556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185171 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185693 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.185745 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.186835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.186886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.186905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.207971 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.236592 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237584 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237653 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237713 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237765 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237807 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237852 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: 
I1205 20:10:30.237895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237929 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.237985 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238014 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238041 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238072 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238189 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.238236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.238934 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339679 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339690 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339755 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339761 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339933 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340023 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340150 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340181 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340204 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340229 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340286 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 
20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.340430 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.339905 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.439026 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.441105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.441169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.441186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.441227 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.442071 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.518104 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.536640 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.549988 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.554243 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-18817fd973919e0cc442b12abfbcdeb1dea40d61476b64224b1ba6397148b20c WatchSource:0}: Error finding container 18817fd973919e0cc442b12abfbcdeb1dea40d61476b64224b1ba6397148b20c: Status 404 returned error can't find the container with id 18817fd973919e0cc442b12abfbcdeb1dea40d61476b64224b1ba6397148b20c Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.567173 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5b92da323c4320a0756078d9576c0a7c8ea772ba56d3c3001c5e366a89c6dafc WatchSource:0}: Error finding container 5b92da323c4320a0756078d9576c0a7c8ea772ba56d3c3001c5e366a89c6dafc: Status 404 returned error can't find the container with id 5b92da323c4320a0756078d9576c0a7c8ea772ba56d3c3001c5e366a89c6dafc Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.568494 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ac883cc782a5b9dac5e1a2ecfe4c9f603565e6ef3c41af40162b6628e342fed8 WatchSource:0}: Error finding container ac883cc782a5b9dac5e1a2ecfe4c9f603565e6ef3c41af40162b6628e342fed8: Status 404 returned error can't find the container with id ac883cc782a5b9dac5e1a2ecfe4c9f603565e6ef3c41af40162b6628e342fed8 Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.570147 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.582150 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.592385 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f8bc062daa09b4c9c92a0b27c23bfcaf9dcc5339a3598b55db9fe241bbeae28a WatchSource:0}: Error finding container f8bc062daa09b4c9c92a0b27c23bfcaf9dcc5339a3598b55db9fe241bbeae28a: Status 404 returned error can't find the container with id f8bc062daa09b4c9c92a0b27c23bfcaf9dcc5339a3598b55db9fe241bbeae28a Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.607938 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-997aef6557c007a07b339407885cddf9bc878fe059be5a76f157053b22b42a38 WatchSource:0}: Error finding container 997aef6557c007a07b339407885cddf9bc878fe059be5a76f157053b22b42a38: Status 404 returned error can't find the container with id 997aef6557c007a07b339407885cddf9bc878fe059be5a76f157053b22b42a38 Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.609209 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Dec 05 20:10:30 crc kubenswrapper[4744]: W1205 20:10:30.821793 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.821893 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.842204 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.843825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.843885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.843903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:30 crc kubenswrapper[4744]: I1205 20:10:30.843942 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:30 crc kubenswrapper[4744]: E1205 20:10:30.844543 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.003541 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.51:6443: connect: connection refused Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.004524 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 08:24:00.762531624 +0000 UTC Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.004576 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 996h13m29.757958473s for next certificate rotation Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.084997 4744 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4" exitCode=0 Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.085086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.085213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"18817fd973919e0cc442b12abfbcdeb1dea40d61476b64224b1ba6397148b20c"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.085431 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.087560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.087614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.087633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.088369 4744 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440" exitCode=0 Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.088449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.088492 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"997aef6557c007a07b339407885cddf9bc878fe059be5a76f157053b22b42a38"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.088593 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.089521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.089555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.089567 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.090565 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.090619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8bc062daa09b4c9c92a0b27c23bfcaf9dcc5339a3598b55db9fe241bbeae28a"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.091962 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f" exitCode=0 Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.092021 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.092042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac883cc782a5b9dac5e1a2ecfe4c9f603565e6ef3c41af40162b6628e342fed8"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.092120 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.092982 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.093014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.093025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.093762 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111" exitCode=0 Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.093866 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.094177 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b92da323c4320a0756078d9576c0a7c8ea772ba56d3c3001c5e366a89c6dafc"} Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.094356 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.095346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: 
I1205 20:10:31.095401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.095424 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.098384 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.100540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.100580 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.100652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: W1205 20:10:31.115867 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Dec 05 20:10:31 crc kubenswrapper[4744]: E1205 20:10:31.115959 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:10:31 crc kubenswrapper[4744]: W1205 20:10:31.220965 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Dec 05 20:10:31 crc kubenswrapper[4744]: E1205 20:10:31.221064 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:10:31 crc kubenswrapper[4744]: E1205 20:10:31.409908 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Dec 05 20:10:31 crc kubenswrapper[4744]: W1205 20:10:31.483530 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Dec 05 20:10:31 crc kubenswrapper[4744]: E1205 20:10:31.483816 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 
20:10:31.645056 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.646163 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.646194 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.646202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:31 crc kubenswrapper[4744]: I1205 20:10:31.646228 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.073246 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100682 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100735 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100749 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100764 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100781 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.100915 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.102106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.102149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.102160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.103052 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322" exitCode=0 Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.103129 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.103318 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.104193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.104232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.104244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.105280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.105440 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.106564 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.106592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.106605 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.117029 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.117073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.117093 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.117240 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.119031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.119066 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.119078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:32 
crc kubenswrapper[4744]: I1205 20:10:32.121575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.121605 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.121618 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144"} Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.121699 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.122604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.122630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.122640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.587460 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:32 crc kubenswrapper[4744]: I1205 20:10:32.874155 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.128867 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798" exitCode=0 Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.128934 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798"} Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129159 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129218 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129244 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129325 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129576 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.129648 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:33 crc 
kubenswrapper[4744]: I1205 20:10:33.131254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131262 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.131522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.132044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.132104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.132130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.348076 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.356810 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:33 crc kubenswrapper[4744]: I1205 20:10:33.817264 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138671 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4"} Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138739 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714"} Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138812 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 
20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a"} Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.138929 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.140834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.480916 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.481192 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.482713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.482780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.482798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:34 crc kubenswrapper[4744]: I1205 20:10:34.986094 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.147339 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d"} Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.147397 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.147409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279"} Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.147460 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.148930 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.148976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.148995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.149686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.149743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.149761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.841625 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.841946 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.847646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.847752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:35 crc kubenswrapper[4744]: I1205 20:10:35.847771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.149619 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.149634 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151355 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151376 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:36 crc kubenswrapper[4744]: I1205 20:10:36.151989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.986223 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.986408 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.995632 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.995813 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.997561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.997619 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:37 crc kubenswrapper[4744]: I1205 20:10:37.997647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:38 crc kubenswrapper[4744]: I1205 20:10:38.404935 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 20:10:38 crc kubenswrapper[4744]: I1205 20:10:38.405187 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:38 crc kubenswrapper[4744]: I1205 20:10:38.407097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:38 crc kubenswrapper[4744]: I1205 20:10:38.407148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:38 crc kubenswrapper[4744]: I1205 20:10:38.407167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:39 crc kubenswrapper[4744]: I1205 20:10:39.050702 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 20:10:39 crc kubenswrapper[4744]: I1205 20:10:39.190832 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:39 crc kubenswrapper[4744]: I1205 20:10:39.192622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:39 crc kubenswrapper[4744]: I1205 20:10:39.192699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:39 crc kubenswrapper[4744]: I1205 20:10:39.192717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:40 crc kubenswrapper[4744]: E1205 20:10:40.145651 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 20:10:41 crc kubenswrapper[4744]: E1205 20:10:41.647867 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.010538 4744 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:10:42 crc kubenswrapper[4744]: E1205 20:10:42.075607 4744 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:10:42 crc kubenswrapper[4744]: W1205 20:10:42.527662 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.527798 4744 trace.go:236] Trace[167092101]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:10:32.526) (total time: 10001ms): Dec 05 20:10:42 crc kubenswrapper[4744]: Trace[167092101]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:10:42.527) Dec 05 20:10:42 crc kubenswrapper[4744]: Trace[167092101]: [10.001347085s] [10.001347085s] END Dec 05 20:10:42 crc kubenswrapper[4744]: E1205 20:10:42.527839 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.878384 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.878485 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.879415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.879442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:42 crc kubenswrapper[4744]: I1205 20:10:42.879449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:43 crc kubenswrapper[4744]: E1205 20:10:43.010860 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.248603 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.250107 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.250175 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.250201 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.250240 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:43 crc kubenswrapper[4744]: W1205 20:10:43.362935 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.363100 4744 trace.go:236] Trace[610835507]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:10:33.361) (total time: 10001ms): Dec 05 20:10:43 crc kubenswrapper[4744]: Trace[610835507]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:10:43.362) Dec 05 20:10:43 crc kubenswrapper[4744]: Trace[610835507]: [10.001730594s] [10.001730594s] END Dec 05 20:10:43 crc kubenswrapper[4744]: E1205 20:10:43.363157 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:10:43 crc kubenswrapper[4744]: W1205 20:10:43.502386 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.502500 4744 trace.go:236] Trace[538318182]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:10:33.500) (total time: 10002ms): Dec 05 20:10:43 crc kubenswrapper[4744]: Trace[538318182]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:10:43.502) Dec 05 20:10:43 crc kubenswrapper[4744]: Trace[538318182]: [10.002084164s] [10.002084164s] END Dec 05 20:10:43 crc kubenswrapper[4744]: E1205 20:10:43.502528 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.562765 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.562844 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.578273 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.578387 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.824076 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]log ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]etcd ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/priority-and-fairness-filter ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-apiextensions-informers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-apiextensions-controllers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/crd-informer-synced ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-system-namespaces-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 05 20:10:43 crc kubenswrapper[4744]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/bootstrap-controller ok Dec 05 20:10:43 
crc kubenswrapper[4744]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/start-kube-aggregator-informers ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-registration-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]autoregister-completion ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 05 20:10:43 crc kubenswrapper[4744]: livez check failed Dec 05 20:10:43 crc kubenswrapper[4744]: I1205 20:10:43.824159 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:10:46 crc kubenswrapper[4744]: I1205 20:10:46.394112 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 20:10:46 crc kubenswrapper[4744]: I1205 20:10:46.410488 4744 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 20:10:46 crc kubenswrapper[4744]: I1205 20:10:46.561433 4744 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.003488 4744 apiserver.go:52] "Watching apiserver" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.008250 4744 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.008689 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.009196 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.009260 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.009577 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.009620 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:47 crc kubenswrapper[4744]: E1205 20:10:47.009696 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.009713 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:47 crc kubenswrapper[4744]: E1205 20:10:47.009372 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.010468 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:47 crc kubenswrapper[4744]: E1205 20:10:47.010880 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.012247 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.012733 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.012856 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.013138 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.014882 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.014943 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.015269 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.015464 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.015741 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.055099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.077540 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.095419 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.105966 4744 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.111320 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.127261 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.143201 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.147725 4744 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.159244 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.177088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.987423 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 20:10:47 crc kubenswrapper[4744]: I1205 20:10:47.987536 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.152692 4744 csr.go:261] certificate signing request csr-q8lhm is approved, waiting to be issued
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.163094 4744 csr.go:257] certificate signing request csr-q8lhm is issued
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.428037 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.443755 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
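[Annotation] Every "Failed to update status for pod" entry in this stretch dies the same way: the API server must consult the pod.network-node-identity.openshift.io webhook, and its backend on 127.0.0.1:9743 is not listening yet (the network-node-identity pod is itself still ContainerCreating, hence the bootstrap loop). A trivial reachability sketch, using the exact address from the errors:

    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    func main() {
    	// Address taken from the webhook errors in this log: every Post to
    	// https://127.0.0.1:9743/pod fails with "connection refused".
    	addr := "127.0.0.1:9743"
    	conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
    	if err != nil {
    		// "connect: connection refused" here matches the status_manager entries:
    		// nothing is listening because the webhook pod has not started yet.
    		fmt.Println("webhook endpoint down:", err)
    		return
    	}
    	conn.Close()
    	fmt.Println("webhook endpoint reachable at", addr)
    }

Because the kubelet retries these patches, the same failures repeat below, including for etcd-crc, until the webhook pod comes up.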
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.444267 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.446906 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.457270 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.469789 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.484530 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.499861 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.510069 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.520109 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.529963 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.540621 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.560326 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.572831 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.582441 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.589543 4744 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.591187 4744 trace.go:236] Trace[1042256276]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 20:10:33.988) (total time: 14602ms): Dec 05 20:10:48 crc kubenswrapper[4744]: Trace[1042256276]: ---"Objects listed" error: 14602ms (20:10:48.590) Dec 05 20:10:48 crc kubenswrapper[4744]: Trace[1042256276]: [14.602288523s] [14.602288523s] END Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.591234 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.591399 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.592985 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.622753 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43554->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.622825 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43554->192.168.126.11:17697: read: connection reset by peer" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.682565 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.682625 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.689874 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.689913 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690045 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690065 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690084 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690104 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690166 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690188 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690233 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690248 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690258 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690320 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690353 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.690386 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:10:49.190367751 +0000 UTC m=+19.420179119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690387 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690421 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690458 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690485 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690532 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690554 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690564 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690574 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690595 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690596 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690618 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690644 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690646 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690667 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690689 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690711 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690721 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690754 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690770 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690775 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690821 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690844 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690860 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690878 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690895 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690931 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690951 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690983 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691036 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691054 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691071 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691143 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691162 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691177 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691194 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691224 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691239 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691253 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691303 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691320 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691335 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691351 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691366 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691382 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691397 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691413 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691431 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691472 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691499 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691519 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691535 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691550 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691566 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691583 
4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691599 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691616 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691633 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691648 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691663 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691677 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691695 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691712 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691728 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691787 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691818 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691834 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691850 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691869 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691883 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691899 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691916 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691931 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691946 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691961 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691977 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692009 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692024 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692042 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692062 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692078 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692093 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692107 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692140 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692155 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692235 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692300 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692319 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692334 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.690857 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692353 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691393 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691402 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691398 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691422 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691576 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692412 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.691740 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692020 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692024 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692164 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692177 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692494 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692181 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692341 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692338 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692592 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692716 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692798 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692829 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693003 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693151 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693153 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693178 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693316 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.692350 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693508 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693564 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693591 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693613 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693638 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693663 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693684 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693709 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693757 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693807 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693831 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693855 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693882 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693909 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693950 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693977 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694007 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694030 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694106 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694133 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694156 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694180 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694249 
4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694271 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694547 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694627 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694653 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694677 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694700 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694724 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694750 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694779 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694802 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694828 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694853 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694877 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694924 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694973 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 
20:10:48.695001 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695058 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695084 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695132 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695159 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695184 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695235 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695260 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 20:10:48 crc kubenswrapper[4744]: 
I1205 20:10:48.695330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695358 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695381 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695404 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695428 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695459 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695484 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695534 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695557 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 
20:10:48.695582 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695605 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695628 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695651 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695675 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695700 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695724 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695755 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695781 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695808 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 
20:10:48.695835 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695862 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695886 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695915 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695939 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695991 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696254 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696282 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696323 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696347 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696405 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696440 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696695 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696774 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696792 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696806 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696822 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696836 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696850 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696864 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696878 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696891 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696905 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696919 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696933 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696947 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696962 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696975 4744 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696989 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697002 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697015 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697028 4744 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697042 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697055 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697068 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697081 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697095 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 
crc kubenswrapper[4744]: I1205 20:10:48.697108 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697121 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697134 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697148 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697162 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697175 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697188 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697202 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697216 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697229 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697244 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697258 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697273 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697305 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697320 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693607 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693699 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.693917 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694086 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694193 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694222 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694270 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694274 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694468 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694678 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694726 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694741 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.694921 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695079 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695133 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695328 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695536 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695545 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695782 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695822 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.695963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696050 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696199 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696315 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696456 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.696724 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.697846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.700687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.700944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.701105 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.701411 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.701587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.701713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.701892 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702006 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702332 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702617 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702786 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702865 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.702918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.703266 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.703307 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.703374 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.703451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.704148 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.704357 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.704555 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.704762 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705017 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705068 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705247 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705280 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705682 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705923 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.705962 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706033 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706008 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706410 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706411 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706699 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706808 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.706844 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.706921 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.707713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708211 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708233 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708262 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708556 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.708738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.709160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.709726 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.710253 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.710499 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.710624 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.710926 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.710958 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.710995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.711019 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.711107 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:49.211057222 +0000 UTC m=+19.440868590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.711701 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.711816 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.711956 4744 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712103 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712199 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712454 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712516 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712549 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.712647 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:49.212634411 +0000 UTC m=+19.442445899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712686 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.712773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.713168 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.713303 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.713942 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.714461 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.714557 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.714585 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.714919 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.715603 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.716420 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.716525 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.717260 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.717460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.717493 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.717516 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.720488 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.721991 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.722026 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.722045 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.722120 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:49.222096365 +0000 UTC m=+19.451907843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.723222 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.723264 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.723282 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:48 crc kubenswrapper[4744]: E1205 20:10:48.723379 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:49.223352977 +0000 UTC m=+19.453164465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.724582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.724609 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.724907 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.725048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.726691 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.726725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.727073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.732028 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.732271 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.732364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.732514 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.734369 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.735949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.736219 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.736278 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.736470 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.736908 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.737507 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.738335 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.738602 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.738611 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.739729 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.739904 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.741724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.743564 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.743652 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.743757 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744214 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744084 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744494 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744812 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744915 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744983 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.744922 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.745158 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.745173 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.745435 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.745721 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.746073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.748057 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.748342 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.749074 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.752515 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.752963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.769688 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jsdsn"] Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.770174 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.772920 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.773032 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.773145 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.773998 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.780674 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.787782 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.789941 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.792845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.798154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.798204 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.798965 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.803861 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804108 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804215 4744 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804332 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804500 4744 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804583 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804675 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804766 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804843 4744 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.804916 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.805004 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.805080 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.805142 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.805218 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.805323 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.807897 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808045 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808138 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808203 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808262 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808382 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808451 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808515 4744 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808568 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808627 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808679 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808733 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808788 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808846 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808897 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.808951 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809003 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809110 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809192 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809273 4744 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc 
kubenswrapper[4744]: I1205 20:10:48.809370 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809490 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809586 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.809694 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.801360 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.800501 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.802394 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810078 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810146 4744 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810204 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810259 4744 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810344 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810423 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810530 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810660 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810830 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.810920 4744 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811038 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811148 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811265 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811374 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811481 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.811577 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812351 4744 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812479 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812590 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812688 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812771 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812853 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.812941 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813026 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813103 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813183 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813265 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813381 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813475 4744 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813551 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813611 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813662 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813715 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813771 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813826 4744 reconciler_common.go:293] "Volume 
detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813876 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813926 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.813981 4744 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814033 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814083 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814142 4744 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814198 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814252 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814339 4744 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814410 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814471 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814521 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814581 4744 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814631 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814681 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814731 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814790 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814848 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814898 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.814951 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815000 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815086 4744 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815146 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815200 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815254 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815329 4744 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815395 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815455 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815513 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815573 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815622 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815675 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815725 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815779 4744 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815833 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815887 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815941 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.815991 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816040 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816095 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816145 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816200 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816253 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816325 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816393 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816451 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816505 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816555 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816605 4744 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816660 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816718 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816777 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816834 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816891 4744 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816945 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.816994 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817050 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817104 4744 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817155 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817208 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817262 4744 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817453 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817485 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817495 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817504 4744 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817517 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817527 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817538 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817549 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817558 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817566 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817575 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817583 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817590 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817600 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817609 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817618 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817626 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" 
(UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817634 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.817641 4744 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.822791 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.823325 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.823367 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.827145 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.828453 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.833829 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.842370 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.845703 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.853140 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.858828 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.862888 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.873945 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.884018 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: W1205 20:10:48.884338 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5ac90abdb57ca3504187de54aa13e5812434a5f0bdd9a5e1e81446f70b37ff3d WatchSource:0}: Error finding container 5ac90abdb57ca3504187de54aa13e5812434a5f0bdd9a5e1e81446f70b37ff3d: Status 404 returned error can't find the container with id 5ac90abdb57ca3504187de54aa13e5812434a5f0bdd9a5e1e81446f70b37ff3d Dec 05 20:10:48 crc kubenswrapper[4744]: W1205 20:10:48.885735 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1e2a9c099bbf03e8eabe5b33f80b86aa5ab61401d567b9464a11f7c0488063b6 WatchSource:0}: Error finding container 1e2a9c099bbf03e8eabe5b33f80b86aa5ab61401d567b9464a11f7c0488063b6: Status 404 returned error can't find the container with id 1e2a9c099bbf03e8eabe5b33f80b86aa5ab61401d567b9464a11f7c0488063b6 Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.898669 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.919095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5969bfd5-aba0-4d9f-9b90-16de741c404a-hosts-file\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.919165 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff2h6\" (UniqueName: \"kubernetes.io/projected/5969bfd5-aba0-4d9f-9b90-16de741c404a-kube-api-access-ff2h6\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.933915 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.948214 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.978773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978
a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac6
12e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:48 crc kubenswrapper[4744]: I1205 20:10:48.993364 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.006341 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.017906 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.020272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5969bfd5-aba0-4d9f-9b90-16de741c404a-hosts-file\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.020421 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff2h6\" (UniqueName: \"kubernetes.io/projected/5969bfd5-aba0-4d9f-9b90-16de741c404a-kube-api-access-ff2h6\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.020547 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5969bfd5-aba0-4d9f-9b90-16de741c404a-hosts-file\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.079581 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.079695 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.079804 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.079860 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.079923 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.079990 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.120240 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.120969 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff2h6\" (UniqueName: \"kubernetes.io/projected/5969bfd5-aba0-4d9f-9b90-16de741c404a-kube-api-access-ff2h6\") pod \"node-resolver-jsdsn\" (UID: \"5969bfd5-aba0-4d9f-9b90-16de741c404a\") " pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.144679 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.154552 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.164727 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-05 20:05:48 +0000 UTC, rotation deadline is 2026-09-16 14:46:33.984646855 +0000 UTC Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.164814 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6834h35m44.819836256s for next certificate rotation Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.221524 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.221601 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.221622 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.221700 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.221746 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:50.221734148 +0000 UTC m=+20.451545516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.221794 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:10:50.221787139 +0000 UTC m=+20.451598507 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.221846 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.221866 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:50.221860531 +0000 UTC m=+20.451671899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.225153 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.243012 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1" exitCode=255 Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.243095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1"} Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.243688 4744 scope.go:117] "RemoveContainer" containerID="32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.246749 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1e2a9c099bbf03e8eabe5b33f80b86aa5ab61401d567b9464a11f7c0488063b6"} Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.255401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5ac90abdb57ca3504187de54aa13e5812434a5f0bdd9a5e1e81446f70b37ff3d"} Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.256619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"147588292c936cce7d3db27d79dcc8ac442e89221a5e95a45f6ee428a2089e53"} Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.259627 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.269138 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.274860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.288312 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.302451 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.314403 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.321929 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.321970 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322083 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322103 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322114 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322146 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:50.322134612 +0000 UTC m=+20.551945980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322233 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322278 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322314 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.322364 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:50.322348777 +0000 UTC m=+20.552160155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.340847 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.354664 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.373445 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.383352 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.397017 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jsdsn" Dec 05 20:10:49 crc kubenswrapper[4744]: W1205 20:10:49.414797 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5969bfd5_aba0_4d9f_9b90_16de741c404a.slice/crio-f14d7e98f15fa3bb6be452c03dac369f486c68f695035dcf9ed2b34ef0c06c14 WatchSource:0}: Error finding container f14d7e98f15fa3bb6be452c03dac369f486c68f695035dcf9ed2b34ef0c06c14: Status 404 returned error can't find the container with id f14d7e98f15fa3bb6be452c03dac369f486c68f695035dcf9ed2b34ef0c06c14 Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.553428 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:10:49 crc kubenswrapper[4744]: I1205 20:10:49.911361 4744 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 20:10:49 crc kubenswrapper[4744]: W1205 20:10:49.913123 4744 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.913948 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.51:47814->38.102.83.51:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.187e6ab5397bf97e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:10:30.561864062 +0000 UTC m=+0.791675470,LastTimestamp:2025-12-05 20:10:30.561864062 +0000 UTC m=+0.791675470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:10:49 crc kubenswrapper[4744]: E1205 20:10:49.914115 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Post \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases?timeout=10s\": read tcp 38.102.83.51:47814->38.102.83.51:6443: use of closed network connection" interval="6.4s" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.089554 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.090181 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.091810 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.092568 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.093644 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.094094 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.094220 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.094899 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.095918 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.096556 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.097567 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.098051 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.099304 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.099804 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.100349 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.101277 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.101836 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.102856 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.103245 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.103636 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.103991 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 20:10:50 crc 
kubenswrapper[4744]: I1205 20:10:50.104990 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.105455 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.106429 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.106868 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.107951 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.108382 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.109008 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.110123 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.110741 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.111866 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.112357 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.113307 4744 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.113416 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.115204 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 20:10:50 crc 
kubenswrapper[4744]: I1205 20:10:50.116127 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.116555 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.118088 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.119184 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.120384 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.121022 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.122100 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.122571 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.123723 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.124399 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.125666 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.126174 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.127103 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.127730 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.129036 4744 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.129608 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.129545 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7
866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.130633 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.131081 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.132066 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.132655 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.133113 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.145646 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.164180 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.176563 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.190761 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.201414 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.214678 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.229454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.229545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.229575 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.229701 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.229789 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:52.229770159 +0000 UTC m=+22.459581527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.229875 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:10:52.229865271 +0000 UTC m=+22.459676639 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.229988 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.230036 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:52.230025095 +0000 UTC m=+22.459836463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.261576 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.261643 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.263657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.266160 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.267953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.268498 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.270130 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jsdsn" event={"ID":"5969bfd5-aba0-4d9f-9b90-16de741c404a","Type":"ContainerStarted","Data":"0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 
20:10:50.270157 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jsdsn" event={"ID":"5969bfd5-aba0-4d9f-9b90-16de741c404a","Type":"ContainerStarted","Data":"f14d7e98f15fa3bb6be452c03dac369f486c68f695035dcf9ed2b34ef0c06c14"} Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.289099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.302721 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.315502 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.329115 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.330331 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.330388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330565 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330610 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:50 crc 
kubenswrapper[4744]: E1205 20:10:50.330625 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330684 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:52.330666435 +0000 UTC m=+22.560477803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330744 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330753 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330761 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:50 crc kubenswrapper[4744]: E1205 20:10:50.330782 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:52.330775868 +0000 UTC m=+22.560587236 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.365217 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.386271 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.409555 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.426713 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.441313 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.455817 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.466680 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.477890 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.487038 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.497185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.507345 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.523816 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.539763 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:50 crc kubenswrapper[4744]: I1205 20:10:50.549137 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.065363 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7qlm7"] Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.065976 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jrcln"] Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.066180 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.066760 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.068096 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.068130 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.068659 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.068926 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.069134 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.069159 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.071697 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.079812 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.079826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.079926 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.079942 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.080053 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.080138 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.083282 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.096257 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.108014 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.118996 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.130318 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.141815 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149162 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149228 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-cnibin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cnibin\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149358 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-hostroot\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-daemon-config\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-bin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149699 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-multus-certs\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x692c\" (UniqueName: \"kubernetes.io/projected/89bdeba9-f644-4465-a9f8-82c682f6aea3-kube-api-access-x692c\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149886 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-netns\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.149965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-kubelet\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150042 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150117 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150204 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-system-cni-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-etc-kubernetes\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150372 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2zx\" (UniqueName: \"kubernetes.io/projected/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-kube-api-access-8r2zx\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150468 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-k8s-cni-cncf-io\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150577 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-os-release\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-binary-copy\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150737 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-system-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150811 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-multus\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-socket-dir-parent\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.150965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-conf-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.151037 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-cni-binary-copy\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.151104 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-os-release\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 
20:10:51.166940 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\
\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.186466 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.197824 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.210175 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.222266 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-mul
tus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.235329 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.248863 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-cnibin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cnibin\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-daemon-config\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251511 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-hostroot\") pod \"multus-7qlm7\" (UID: 
\"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251525 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-bin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-multus-certs\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251561 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x692c\" (UniqueName: \"kubernetes.io/projected/89bdeba9-f644-4465-a9f8-82c682f6aea3-kube-api-access-x692c\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251577 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-netns\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-kubelet\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251596 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cnibin\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251873 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-cnibin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251913 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-multus-certs\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251948 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-hostroot\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251955 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-netns\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-bin\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252015 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-kubelet\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.251612 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252187 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-daemon-config\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252190 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-system-cni-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-k8s-cni-cncf-io\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-etc-kubernetes\") pod \"multus-7qlm7\" (UID: 
\"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252572 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2zx\" (UniqueName: \"kubernetes.io/projected/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-kube-api-access-8r2zx\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252669 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-system-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-multus\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252820 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-os-release\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252891 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-binary-copy\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252834 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-var-lib-cni-multus\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252463 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-host-run-k8s-cni-cncf-io\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-system-cni-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " 
pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-system-cni-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.252532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-etc-kubernetes\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253158 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-os-release\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253170 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-socket-dir-parent\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-socket-dir-parent\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-conf-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253464 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-os-release\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253524 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-os-release\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89bdeba9-f644-4465-a9f8-82c682f6aea3-multus-conf-dir\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253559 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-cni-binary-copy\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253660 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-cni-binary-copy\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.253750 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.254190 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89bdeba9-f644-4465-a9f8-82c682f6aea3-cni-binary-copy\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.262472 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.277727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x692c\" (UniqueName: \"kubernetes.io/projected/89bdeba9-f644-4465-a9f8-82c682f6aea3-kube-api-access-x692c\") pod \"multus-7qlm7\" (UID: \"89bdeba9-f644-4465-a9f8-82c682f6aea3\") " pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.279992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2zx\" (UniqueName: \"kubernetes.io/projected/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-kube-api-access-8r2zx\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.299456 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.313695 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.328179 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.341791 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.355500 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.370908 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.378857 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7qlm7" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.384915 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.453058 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bkhvd"] Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.453501 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.455435 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.457132 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.457272 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.457314 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.457341 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.476985 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.490002 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.503056 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.516564 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.528528 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.538728 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.549709 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.555487 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e25986a8-4343-4c98-bc53-6c1b077661f9-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.555530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzvb\" (UniqueName: \"kubernetes.io/projected/e25986a8-4343-4c98-bc53-6c1b077661f9-kube-api-access-9xzvb\") pod \"machine-config-daemon-bkhvd\" (UID: 
\"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.555548 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e25986a8-4343-4c98-bc53-6c1b077661f9-proxy-tls\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.555565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e25986a8-4343-4c98-bc53-6c1b077661f9-rootfs\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.561856 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.573706 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.584362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.594543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.615748 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.656230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e25986a8-4343-4c98-bc53-6c1b077661f9-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.656312 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzvb\" (UniqueName: \"kubernetes.io/projected/e25986a8-4343-4c98-bc53-6c1b077661f9-kube-api-access-9xzvb\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.656349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e25986a8-4343-4c98-bc53-6c1b077661f9-proxy-tls\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.656371 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e25986a8-4343-4c98-bc53-6c1b077661f9-rootfs\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.656437 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e25986a8-4343-4c98-bc53-6c1b077661f9-rootfs\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 
20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.657210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e25986a8-4343-4c98-bc53-6c1b077661f9-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.660119 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e25986a8-4343-4c98-bc53-6c1b077661f9-proxy-tls\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.671093 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzvb\" (UniqueName: \"kubernetes.io/projected/e25986a8-4343-4c98-bc53-6c1b077661f9-kube-api-access-9xzvb\") pod \"machine-config-daemon-bkhvd\" (UID: \"e25986a8-4343-4c98-bc53-6c1b077661f9\") " pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.679431 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcd4e5b0-9a0c-4819-9f3b-e13521e44b41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jrcln\" (UID: \"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\") " pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.684139 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jrcln" Dec 05 20:10:51 crc kubenswrapper[4744]: W1205 20:10:51.762516 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd4e5b0_9a0c_4819_9f3b_e13521e44b41.slice/crio-3143501af327cf0367391fe073c8b75030da5398deef2b28f22f5adca59d0b65 WatchSource:0}: Error finding container 3143501af327cf0367391fe073c8b75030da5398deef2b28f22f5adca59d0b65: Status 404 returned error can't find the container with id 3143501af327cf0367391fe073c8b75030da5398deef2b28f22f5adca59d0b65 Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.770362 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.792553 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.803650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.803727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.803742 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.803854 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.817876 4744 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.818179 4744 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.822266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.822347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.822358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.822378 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.822392 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.867074 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bk4n"] Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.868107 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.870739 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.870867 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.871114 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.871393 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.873777 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.873874 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.874073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.874081 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.876318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.876347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.876359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.876380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.876394 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.882667 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.889694 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.897085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.897132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.897143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.897159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.897170 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.901969 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.910131 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.914331 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.914364 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.914375 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.914392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.914404 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.917544 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.926779 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.932137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.932172 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.932183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.932203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.932214 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.940979 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.943699 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: E1205 20:10:51.943812 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.946318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.946348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.946358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.946373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.946383 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:51Z","lastTransitionTime":"2025-12-05T20:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.956889 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966379 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966468 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97hdl\" (UniqueName: \"kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl\") pod \"ovnkube-node-6bk4n\" (UID: 
\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966491 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966508 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966533 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966553 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966575 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966595 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966646 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966675 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966697 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966736 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966761 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966782 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966803 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966832 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.966858 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.970531 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:51 crc kubenswrapper[4744]: I1205 20:10:51.990380 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:51Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.005177 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.020265 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.033916 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.048543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.049391 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.049438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.049462 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.049481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.049493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.067699 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.067988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068099 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068206 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068464 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068144 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068428 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068504 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068534 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068678 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068791 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068839 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068848 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068862 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068882 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068893 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068900 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068897 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97hdl\" (UniqueName: \"kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068956 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069006 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.068988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069044 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069072 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069084 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069166 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069442 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides\") pod \"ovnkube-node-6bk4n\" 
(UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.069513 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.073528 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232690
19bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.073835 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.085203 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97hdl\" (UniqueName: \"kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl\") pod \"ovnkube-node-6bk4n\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.092567 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.151859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.151897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.151910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.151949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.151963 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.184587 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:52 crc kubenswrapper[4744]: W1205 20:10:52.213752 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bea8e6_6eff_4db0_8e98_20a5ae64e0d6.slice/crio-93d028c9806d6ee200f9c1442800c265097fab978e5e2daad308c1acffa58359 WatchSource:0}: Error finding container 93d028c9806d6ee200f9c1442800c265097fab978e5e2daad308c1acffa58359: Status 404 returned error can't find the container with id 93d028c9806d6ee200f9c1442800c265097fab978e5e2daad308c1acffa58359 Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.254563 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.254615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.254632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.254650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.254661 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.271313 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.271484 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.271568 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:10:56.271531596 +0000 UTC m=+26.501343004 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.271635 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.271648 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.271698 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:56.27168047 +0000 UTC m=+26.501491948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.271830 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.271938 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:56.271913547 +0000 UTC m=+26.501724975 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.287751 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.288985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"93d028c9806d6ee200f9c1442800c265097fab978e5e2daad308c1acffa58359"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.291903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.291946 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.291997 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"571367be4d4a3d8b615de3bebf174b7eeafb68ded3dafed9c11fa870de22e47a"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.293549 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4" exitCode=0 Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.293677 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.293733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerStarted","Data":"3143501af327cf0367391fe073c8b75030da5398deef2b28f22f5adca59d0b65"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.295477 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerStarted","Data":"07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.295509 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerStarted","Data":"93a2e51c921fb323ea81ac1f662d2d115bfaddabbcdefc22e2eb7879fb22f31f"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 
20:10:52.309197 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.327677 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.344271 4744 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.359173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.359206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.359216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.359230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.359240 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.367156 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.376054 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.376181 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.376618 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:52 crc 
kubenswrapper[4744]: E1205 20:10:52.376649 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.376660 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.376714 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:56.376696228 +0000 UTC m=+26.606507596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.377131 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.377155 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.377166 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:52 crc kubenswrapper[4744]: E1205 20:10:52.377209 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:10:56.377193221 +0000 UTC m=+26.607004699 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.383246 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.396457 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.407857 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.418535 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.436332 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.451185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.461851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.461917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.461930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.461972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.461989 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.463163 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.475836 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.485132 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.494445 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.505356 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.514997 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.528687 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.538792 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.562287 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.564516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.564572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 
20:10:52.564591 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.564617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.564635 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.578057 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.623870 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.661074 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc 
kubenswrapper[4744]: I1205 20:10:52.666854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.666921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.666969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.667002 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.667025 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.704464 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.742794 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.769104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.769159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.769177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.769199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.769216 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.783135 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.821149 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:52Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.871904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.871954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.871968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.871988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.871999 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.974475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.974543 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.974561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.974585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:52 crc kubenswrapper[4744]: I1205 20:10:52.974605 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:52Z","lastTransitionTime":"2025-12-05T20:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.018156 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9dddz"] Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.018904 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.021836 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.022417 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.022514 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.022815 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.050165 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.066205 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.077022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.077062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.077075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.077090 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.077102 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.080456 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:53 crc kubenswrapper[4744]: E1205 20:10:53.080549 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.080642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:53 crc kubenswrapper[4744]: E1205 20:10:53.080715 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.080771 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:53 crc kubenswrapper[4744]: E1205 20:10:53.080911 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.081115 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.087396 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlf5g\" (UniqueName: \"kubernetes.io/projected/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-kube-api-access-zlf5g\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.087459 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-serviceca\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.087490 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-host\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.096122 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 
2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.106490 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.134640 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.179477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.179511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.179519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.179540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.179551 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.187914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-serviceca\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.187958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-host\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.187993 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlf5g\" (UniqueName: \"kubernetes.io/projected/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-kube-api-access-zlf5g\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.188146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-host\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.188996 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-serviceca\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.189052 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.225602 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlf5g\" (UniqueName: \"kubernetes.io/projected/df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee-kube-api-access-zlf5g\") pod \"node-ca-9dddz\" (UID: \"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\") " pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.237793 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.279407 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.282663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.282709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.282720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.282738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.282748 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.300061 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a" exitCode=0 Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.300160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.302748 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3" exitCode=0 Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.302836 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.317195 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8
c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.358766 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9dddz" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.367704 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: W1205 20:10:53.376654 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8d9ec8_e8fd_4d2f_bd06_0d082a38e4ee.slice/crio-50bc0e5797457b557365c601d0102c7a1b2ee63e848c60e42edd3d85d79c155e WatchSource:0}: Error finding container 
50bc0e5797457b557365c601d0102c7a1b2ee63e848c60e42edd3d85d79c155e: Status 404 returned error can't find the container with id 50bc0e5797457b557365c601d0102c7a1b2ee63e848c60e42edd3d85d79c155e Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.389004 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.389033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.389041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.389056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.389066 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.399702 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.438896 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.487390 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status:
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc
kubenswrapper[4744]: I1205 20:10:53.493119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.493189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.493202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.493243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.493256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.524499 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.564808 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.595401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.595443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.595453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.595466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.595475 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.597646 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.638108 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.678330 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.697987 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.698022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.698033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.698049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.698060 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.720265 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.759632 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.798505 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.800184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.800217 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.800228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.800245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.800256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.839443 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.880034 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.902258 4744 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.902312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.902320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.902335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.902344 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:53Z","lastTransitionTime":"2025-12-05T20:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.923113 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z 
is after 2025-08-24T17:21:41Z" Dec 05 20:10:53 crc kubenswrapper[4744]: I1205 20:10:53.960994 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.000048 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.004547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.004586 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.004596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.004611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.004621 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.043561 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.107398 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.107446 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.107458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.107475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.107489 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.209995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.210041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.210051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.210072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.210119 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.310199 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294" exitCode=0 Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.310377 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.313740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.313784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.313802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.313825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.313842 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.322827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.322917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.322964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.322990 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.323016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.323041 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.330736 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.333196 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dddz" event={"ID":"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee","Type":"ContainerStarted","Data":"4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.333258 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dddz" event={"ID":"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee","Type":"ContainerStarted","Data":"50bc0e5797457b557365c601d0102c7a1b2ee63e848c60e42edd3d85d79c155e"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.343842 
4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.363110 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.385811 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.401831 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.416256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.416317 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.416329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.416347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.416361 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.420632 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.438744 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.456989 4744 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.472648 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.488160 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.506343 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.519083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.519118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.519127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.519142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.519152 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.521326 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.557964 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.601829 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.621418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.621442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.621450 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.621461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.621469 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.640203 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.680714 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.721617 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.723551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.723605 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.723623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.723645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.723661 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.772034 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.803015 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.826612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.826647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.826659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.826676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.826689 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.840557 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.878817 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.918961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.928797 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.928864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.928879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.928898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.928910 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:54Z","lastTransitionTime":"2025-12-05T20:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.960801 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:54Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.990127 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:54 crc kubenswrapper[4744]: I1205 20:10:54.994341 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.005847 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.017274 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.032330 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.032397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.032419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.032445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.032464 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.063476 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.079831 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.079868 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.079832 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:55 crc kubenswrapper[4744]: E1205 20:10:55.079989 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:55 crc kubenswrapper[4744]: E1205 20:10:55.080110 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:55 crc kubenswrapper[4744]: E1205 20:10:55.080377 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.104574 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.136613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.136689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.136738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.136768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.136797 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.147159 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.195589 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.225428 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.240794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.240856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.240874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.240903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.240920 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.265764 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.307955 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.342397 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526" exitCode=0 Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.342537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.343035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.343104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.343149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.343188 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.343212 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.374931 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.403374 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.425369 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.445577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.445634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.445652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.445675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.445693 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.467934 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.508142 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.545676 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.549947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.550000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.550016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.550041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.550270 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.587452 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.624184 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.651933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.651968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.651977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.651991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.652001 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.660049 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.700643 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.741956 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.754202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.754246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.754257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.754275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.754286 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.794512 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59b
a2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.823887 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.856526 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.856558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.856567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.856582 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.856591 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.861404 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.916971 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.948704 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.959392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.959443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.959457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.959473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.959487 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:55Z","lastTransitionTime":"2025-12-05T20:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:55 crc kubenswrapper[4744]: I1205 20:10:55.978084 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:55Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.022642 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.056436 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.062661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.062720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.062737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.062758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.062774 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.098499 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.138559 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.165653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.165698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.165710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.165728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.165740 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.194551 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59b
a2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.218647 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.258973 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.267889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.267939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.267953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.267974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.267989 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.304384 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.316050 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.316170 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.316211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.316278 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.316351 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:04.316334384 +0000 UTC m=+34.546145752 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.316373 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:11:04.316361955 +0000 UTC m=+34.546173333 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.316438 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.316575 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:04.316540569 +0000 UTC m=+34.546351997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.339737 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.353705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.357794 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040" exitCode=0 Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.357846 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.371133 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.371182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.371199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.371223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.371240 4744 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.388437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.417919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.418021 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418337 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418388 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418410 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418438 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418486 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418496 4744 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:04.418471242 +0000 UTC m=+34.648282670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418514 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:56 crc kubenswrapper[4744]: E1205 20:10:56.418610 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:04.418565724 +0000 UTC m=+34.648377182 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.420253 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.455982 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.473346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.473373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.473385 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.473400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.473409 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.498162 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.542196 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.575839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.575891 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.575905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.575922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.575934 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.576901 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.618748 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.661420 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.678452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.678482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.678494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.678513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.678526 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.706489 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59b
a2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.747382 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.779047 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.781018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.781070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.781082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.781102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.781115 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.822107 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2
zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.873879 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.883724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.883781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.883809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.883842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.883867 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.904950 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.940804 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.981979 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:56Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.987069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.987103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.987114 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.987128 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:56 crc kubenswrapper[4744]: I1205 20:10:56.987140 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:56Z","lastTransitionTime":"2025-12-05T20:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.079677 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.079871 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:57 crc kubenswrapper[4744]: E1205 20:10:57.079938 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:57 crc kubenswrapper[4744]: E1205 20:10:57.080085 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.079712 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:57 crc kubenswrapper[4744]: E1205 20:10:57.080237 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.094616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.094716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.094739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.094804 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.094823 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.197709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.197782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.197796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.197848 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.197866 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.301960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.302029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.302067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.302104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.302126 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.367672 4744 generic.go:334] "Generic (PLEG): container finished" podID="dcd4e5b0-9a0c-4819-9f3b-e13521e44b41" containerID="7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40" exitCode=0 Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.367743 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerDied","Data":"7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.407083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.407539 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.407553 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.407573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.407588 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.409448 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.432509 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.448858 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.467752 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.491095 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.505860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.511160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.511204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.511216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.511247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.511259 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.525560 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.554190 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.568350 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.583549 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.598467 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.612409 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.615520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.615584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.615598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.615617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.615629 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.627205 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd79407
8e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.639027 4744 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.651198 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:57Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.718776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.718833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.718864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.718890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.718902 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.822177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.822251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.822271 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.822352 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.822373 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.925849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.925917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.925934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.926005 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:57 crc kubenswrapper[4744]: I1205 20:10:57.926023 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:57Z","lastTransitionTime":"2025-12-05T20:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.029115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.029189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.029213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.029243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.029265 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.132831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.132884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.132901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.132924 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.132942 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.236837 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.236898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.236915 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.236940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.236958 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.340365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.340445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.340468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.340502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.340523 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.443381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.443445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.443468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.443497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.443519 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.546583 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.546638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.546655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.546678 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.546695 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.649597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.649680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.649695 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.649713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.649727 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.752422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.752489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.752508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.752537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.752555 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.856163 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.856231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.856254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.856285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.856351 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.959895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.959972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.959990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.960018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:58 crc kubenswrapper[4744]: I1205 20:10:58.960042 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:58Z","lastTransitionTime":"2025-12-05T20:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.064421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.064495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.064523 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.064555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.064579 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.080280 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.080434 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.080336 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:10:59 crc kubenswrapper[4744]: E1205 20:10:59.080563 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:10:59 crc kubenswrapper[4744]: E1205 20:10:59.080734 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:10:59 crc kubenswrapper[4744]: E1205 20:10:59.080871 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.167078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.167130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.167141 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.167158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.167171 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.270367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.270423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.270447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.270469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.270485 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.373939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.374023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.374047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.374082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.374103 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.383081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.383661 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.383709 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.383733 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.388944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" event={"ID":"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41","Type":"ContainerStarted","Data":"5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.405224 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 
2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.424258 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.427105 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.427190 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.445088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.464501 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.478482 4744 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.478552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.478571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.478595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.478614 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.496658 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef1
11387b345945f3bb7c245d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.513433 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.527496 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.550773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.581993 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.582210 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.582394 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.582559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.582691 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.587113 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.603913 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.617786 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.636690 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.655601 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.671433 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.685548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.685614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.685626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.685644 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.685656 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.691875 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.716951 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.735258 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.752510 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.767000 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.788836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.788894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.788907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.788926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.788941 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.796783 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.816529 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.834182 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.856644 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.887314 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d
4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.891333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.891371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.891384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.891402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.891415 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.908789 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.926561 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.944115 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.961565 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.980486 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.994722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.994806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.994833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.994866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.994893 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:10:59Z","lastTransitionTime":"2025-12-05T20:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:10:59 crc kubenswrapper[4744]: I1205 20:10:59.998735 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:10:59Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.097941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.097988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.097999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.098050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.098063 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.103698 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.124637 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.142373 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.161967 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.192647 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef1
11387b345945f3bb7c245d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.201225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.201277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.201319 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.201345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.201368 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.213674 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.232988 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.255489 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.286966 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d
4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.305493 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.305555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.305574 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.305598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.305616 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.307029 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.327584 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.354862 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.393628 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.407598 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.408865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.408905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.408923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.408948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.408965 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.426690 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.511937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.512002 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.512015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.512037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.512050 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.614273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.614363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.614381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.614405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.614422 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.717634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.717682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.717700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.717724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.717742 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.820346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.820418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.820436 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.820464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.820481 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.923868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.923921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.923941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.923965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:00 crc kubenswrapper[4744]: I1205 20:11:00.923983 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:00Z","lastTransitionTime":"2025-12-05T20:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.026528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.026572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.026584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.026601 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.026613 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.080441 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:01 crc kubenswrapper[4744]: E1205 20:11:01.080615 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.081154 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:01 crc kubenswrapper[4744]: E1205 20:11:01.081269 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.081380 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:01 crc kubenswrapper[4744]: E1205 20:11:01.081457 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.130195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.130253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.130266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.130324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.130338 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.233603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.233669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.233687 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.234109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.234166 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.336858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.336913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.336934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.336962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.336982 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.440558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.440614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.440631 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.440653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.440671 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.543174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.543237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.543256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.543280 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.543332 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.646661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.646733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.646758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.646788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.646812 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.750384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.750484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.750509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.750535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.750551 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.854560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.854953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.855225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.855489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.855757 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.959235 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.959615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.959741 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.959877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:01 crc kubenswrapper[4744]: I1205 20:11:01.960053 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:01Z","lastTransitionTime":"2025-12-05T20:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.063332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.063371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.063383 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.063400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.063411 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.065923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.065972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.065989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.066029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.066046 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: E1205 20:11:02.094752 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.100164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.100228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.100252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.100282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.100371 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: E1205 20:11:02.122037 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.127010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.127137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.127162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.127192 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.127215 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.152396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.152595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.152664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.152724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.152778 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.179185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.179310 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.179371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.179433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.179488 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: E1205 20:11:02.197463 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.199661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.199729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.199751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.199778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.199799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.301850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.301900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.301912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.301930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.301942 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.403758 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/0.log" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.404098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.404145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.404162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.404185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.404202 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.407643 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89" exitCode=1 Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.407693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.408752 4744 scope.go:117] "RemoveContainer" containerID="624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.418890 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.433507 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.456991 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.470364 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.482365 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.506018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.506053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.506064 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.506080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.506091 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.513507 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.531848 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.561649 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.603028 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.608233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.608264 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.608274 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.608300 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.608312 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.619185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.629818 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.641560 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.662537 4744 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.681660 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.694833 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.711481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.711567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.711592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.711615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.711634 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.714883 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:02Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.814122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.814155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.814165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.814178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.814187 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.916967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.917089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.917133 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.917177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:02 crc kubenswrapper[4744]: I1205 20:11:02.917199 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:02Z","lastTransitionTime":"2025-12-05T20:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.020664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.020716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.020733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.020755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.020772 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.080223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.080223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:03 crc kubenswrapper[4744]: E1205 20:11:03.080825 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:03 crc kubenswrapper[4744]: E1205 20:11:03.081013 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.081235 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:03 crc kubenswrapper[4744]: E1205 20:11:03.081454 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.124153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.124264 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.124346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.124382 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.124404 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.226710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.226764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.226782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.226807 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.226825 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.304596 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm"] Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.305252 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.307553 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.309021 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.329684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.329749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.329771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.329800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.329821 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.332755 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.349182 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.363551 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.379503 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.391374 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.391432 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.391501 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wltcz\" (UniqueName: \"kubernetes.io/projected/9867a450-a95a-41ea-9d64-21f01814ed73-kube-api-access-wltcz\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.391543 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9867a450-a95a-41ea-9d64-21f01814ed73-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.399462 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.413972 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/0.log" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.418706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.419580 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.420806 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.432576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.432630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.432646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.432666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.432682 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.441581 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.455444 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.478336 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef1
11387b345945f3bb7c245d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.492524 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.492616 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.492759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wltcz\" (UniqueName: \"kubernetes.io/projected/9867a450-a95a-41ea-9d64-21f01814ed73-kube-api-access-wltcz\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.492819 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9867a450-a95a-41ea-9d64-21f01814ed73-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.493871 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.494356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9867a450-a95a-41ea-9d64-21f01814ed73-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.497887 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.499148 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9867a450-a95a-41ea-9d64-21f01814ed73-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.517730 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.520645 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wltcz\" (UniqueName: \"kubernetes.io/projected/9867a450-a95a-41ea-9d64-21f01814ed73-kube-api-access-wltcz\") pod \"ovnkube-control-plane-749d76644c-m2rtm\" (UID: \"9867a450-a95a-41ea-9d64-21f01814ed73\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.535509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.535584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.535610 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.535641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.535664 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.540835 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.565627 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95
996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.587427 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.605470 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.625043 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.625010 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.639373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.639426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.639443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.639467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.639484 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.650414 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea1
77225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: W1205 20:11:03.651609 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9867a450_a95a_41ea_9d64_21f01814ed73.slice/crio-34fca29df683e2951ec9b3ef206363a41bc2f84e5e75dbbda14c9c9d84d2044a WatchSource:0}: Error finding container 34fca29df683e2951ec9b3ef206363a41bc2f84e5e75dbbda14c9c9d84d2044a: Status 404 returned error can't find the container with id 34fca29df683e2951ec9b3ef206363a41bc2f84e5e75dbbda14c9c9d84d2044a Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.683202 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd2
8451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.701410 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.724155 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.741911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.741954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.741967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.741987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.742002 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.748750 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.768444 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.787903 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.804835 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.821283 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.841753 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.845743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.845784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.845796 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.845816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.845829 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.869363 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.888039 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.899726 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.911909 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.923972 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.932912 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:03Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.948014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.948065 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:03 
crc kubenswrapper[4744]: I1205 20:11:03.948082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.948101 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:03 crc kubenswrapper[4744]: I1205 20:11:03.948116 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:03Z","lastTransitionTime":"2025-12-05T20:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.050984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.051028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.051039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.051085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.051097 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.153281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.153562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.153662 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.153745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.153826 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.256822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.257051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.257112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.257170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.257228 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.359905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.359947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.359959 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.359976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.359990 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.404724 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.404935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.404959 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 20:11:20.404922612 +0000 UTC m=+50.634734020 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.405048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.405092 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.405214 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:20.405187629 +0000 UTC m=+50.634999077 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.405241 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.405373 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:20.405342753 +0000 UTC m=+50.635154191 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.424575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" event={"ID":"9867a450-a95a-41ea-9d64-21f01814ed73","Type":"ContainerStarted","Data":"34fca29df683e2951ec9b3ef206363a41bc2f84e5e75dbbda14c9c9d84d2044a"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.462771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.463030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.463117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.463301 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.463416 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.505639 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.505704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.505909 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.505937 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.505948 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.506001 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:20.505987565 +0000 UTC m=+50.735798933 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.505999 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.506041 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.506066 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.506157 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:20.506130108 +0000 UTC m=+50.735941536 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.566498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.566558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.566570 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.566590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.566601 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.669772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.669846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.669866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.669892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.669917 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.772397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.772449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.772465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.772485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.772501 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.860793 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cgjbb"] Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.861269 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:04 crc kubenswrapper[4744]: E1205 20:11:04.861366 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.876058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.876100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.876113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.876132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.876144 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.877156 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.906145 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd2
8451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.921593 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.938522 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.955507 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.979072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.979120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.979132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.979150 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.979162 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:04Z","lastTransitionTime":"2025-12-05T20:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:04 crc kubenswrapper[4744]: I1205 20:11:04.984702 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.008051 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.011418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.011494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrlv\" (UniqueName: \"kubernetes.io/projected/9d0c84c8-b581-47ce-8cb8-956d3ef79238-kube-api-access-csrlv\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.023221 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9
ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10
:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.035607 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.050678 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.062382 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.080257 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.080341 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.080427 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.080352 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.080755 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.080921 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082760 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.082941 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.098429 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.112231 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.112824 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrlv\" (UniqueName: \"kubernetes.io/projected/9d0c84c8-b581-47ce-8cb8-956d3ef79238-kube-api-access-csrlv\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.112897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.113103 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.113196 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:05.61317128 +0000 UTC m=+35.842982668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.125426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.134871 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrlv\" (UniqueName: \"kubernetes.io/projected/9d0c84c8-b581-47ce-8cb8-956d3ef79238-kube-api-access-csrlv\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.138579 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.151069 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.185998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.186109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.186134 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.186173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.186196 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.289131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.289211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.289234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.289270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.289324 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.392634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.392696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.392719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.392745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.392765 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.430138 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/1.log" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.430845 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/0.log" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.433768 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896" exitCode=1 Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.433824 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.433882 4744 scope.go:117] "RemoveContainer" containerID="624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.435242 4744 scope.go:117] "RemoveContainer" containerID="6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.435809 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.436278 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" event={"ID":"9867a450-a95a-41ea-9d64-21f01814ed73","Type":"ContainerStarted","Data":"aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.436369 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" event={"ID":"9867a450-a95a-41ea-9d64-21f01814ed73","Type":"ContainerStarted","Data":"420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.469935 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.487273 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.495070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.495119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.495132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.495151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.495163 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.496910 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.508386 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.516316 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.527932 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.539630 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.550057 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.568152 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.578588 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.596231 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.597538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.597601 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.597621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.597649 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.597671 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
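The setters.go:603 record above is the kubelet marking the node NotReady: the container runtime reports NetworkReady=false because nothing has yet written a CNI network configuration into /etc/kubernetes/cni/net.d/, which is expected while ovn-kubernetes is still restarting (see the ovnkube-controller records below). A minimal sketch of the same readiness probe, run on the node itself; the directory comes from the record, while the extension list mirrors common CNI config loaders and is an assumption:

    import os
    import sys

    # Directory named in the NetworkPluginNotReady message above.
    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d/"
    # Assumed extensions; CNI config loaders conventionally pick these up.
    CNI_EXTS = (".conf", ".conflist", ".json")

    def cni_configs(conf_dir):
        try:
            return sorted(f for f in os.listdir(conf_dir) if f.endswith(CNI_EXTS))
        except FileNotFoundError:
            return []

    configs = cni_configs(CNI_CONF_DIR)
    if configs:
        print("NetworkReady would be true; found:", ", ".join(configs))
    else:
        print("NetworkReady=false: no CNI configuration file in", CNI_CONF_DIR)
        sys.exit(1)

Once ovnkube-controller stays up long enough to drop its config into that directory, the Ready condition should flip back to True and this record should stop repeating.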
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.611418 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z"
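Every status_manager.go:875 failure in this excerpt is the same fault repeated: the API server cannot deliver the kubelet's status patch because the pod.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743 presents a serving certificate that expired at 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05T20:11:05Z. That gap is consistent with a CRC VM resumed long after its certificates were minted, before the cluster's certificate rotation has caught up. A minimal sketch, assuming the third-party cryptography package is available, that pulls the webhook's serving certificate without chain verification and compares its notAfter to the current time:

    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # third-party; assumed installed

    # Webhook endpoint taken from the Post URL in the records above.
    HOST, PORT = "127.0.0.1", 9743

    # Fetch the serving certificate without chain verification so an
    # expired or self-signed certificate can still be inspected.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    now = datetime.now(timezone.utc)
    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", cert.not_valid_after_utc)  # cryptography >= 42; older releases use not_valid_after
    if cert.not_valid_after_utc < now:
        # Mirrors the kubelet error wording: "current time ... is after ..."
        print(f"expired: current time {now:%Y-%m-%dT%H:%M:%SZ} is after "
              f"{cert.not_valid_after_utc:%Y-%m-%dT%H:%M:%SZ}")

The point of the check is to confirm that the webhook endpoint itself, not any individual pod, is what is failing; the per-pod records below add no new information beyond which status updates are being dropped.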
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.617925 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.618107 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:05 crc kubenswrapper[4744]: E1205 20:11:05.618182 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:06.618159793 +0000 UTC m=+36.847971201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.638214 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd2
8451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 
controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.655239 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.673648 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.693142 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.704276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.705226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.705445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.705887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.706533 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
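Because each rejected patch is retried, the same pods reappear with near-identical records that differ only in timestamp (network-check-source-55646444c4-trplf at 20:11:05.673648 and again at .743394, multus-7qlm7 at .693142 and .764708), and the NotReady events above are re-recorded the same way. A minimal sketch that tallies this churn from a saved journal excerpt; the filename is hypothetical, and the regex matches the record format used throughout this log:

    import re
    from collections import Counter

    # Matches the status_manager.go:875 record format seen above.
    PAT = re.compile(r'"Failed to update status for pod" pod="([^"]+)"')

    counts = Counter()
    with open("kubelet-journal.log", encoding="utf-8") as fh:  # hypothetical path
        for line in fh:
            counts.update(PAT.findall(line))

    # One line per pod, most-retried first.
    for pod, n in counts.most_common():
        print(f"{n:3d}  {pod}")

A tally like this makes it easy to see that the failures are uniform across namespaces rather than specific to any one workload, which again points at the shared webhook rather than the pods.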
Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.722096 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.743394 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.764708 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.787460 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.810889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.810952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.810971 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.810997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.811019 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.820735 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.836093 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.849070 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.852199 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.870406 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.887622 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.907625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.913724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.913776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.913796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.913823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.913841 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:05Z","lastTransitionTime":"2025-12-05T20:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.923420 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.938448 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.960135 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.978435 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:05 crc kubenswrapper[4744]: I1205 20:11:05.999005 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:05Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.016349 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.017108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.017179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.017204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.017232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.017256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.042063 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.061154 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.075452 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.079872 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:06 crc kubenswrapper[4744]: E1205 20:11:06.080112 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.091889 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\
"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.114194 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.119852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.119897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc 
kubenswrapper[4744]: I1205 20:11:06.119914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.119977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.120001 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.142701 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.166380 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.184688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.222588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.222667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.222689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.222721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.222743 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.236033 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.252388 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.272547 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.284571 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.297758 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.319269 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.325548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.325601 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.325619 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.325642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.325660 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.335165 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.358168 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.372776 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.395614 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd2
8451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 
controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.410462 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:06Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.427942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.427999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.428018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.428042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.428059 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.445703 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/1.log" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.531426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.531482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.531501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.531524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.531542 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.626852 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:06 crc kubenswrapper[4744]: E1205 20:11:06.627035 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:06 crc kubenswrapper[4744]: E1205 20:11:06.627128 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:11:08.627103214 +0000 UTC m=+38.856914612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.634713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.634771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.634830 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.634858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.634911 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.738594 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.738952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.739166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.739423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.739634 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.842658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.842736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.842754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.842778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.842795 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.945378 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.945718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.945879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.946065 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:06 crc kubenswrapper[4744]: I1205 20:11:06.946243 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:06Z","lastTransitionTime":"2025-12-05T20:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.050361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.050432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.050454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.050486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.050509 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.080090 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:07 crc kubenswrapper[4744]: E1205 20:11:07.080637 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.080463 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:07 crc kubenswrapper[4744]: E1205 20:11:07.081740 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.080430 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:07 crc kubenswrapper[4744]: E1205 20:11:07.082331 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.153807 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.153874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.153897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.153926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.153948 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.256846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.256897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.256915 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.256940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.256957 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.360101 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.360177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.360198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.360250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.360268 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.463208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.463272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.463333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.463367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.463388 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.566371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.566434 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.566457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.566485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.566510 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.670190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.670259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.670281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.670356 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.670382 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.773450 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.773510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.773528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.773562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.773579 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.876216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.876329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.876361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.876455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.876484 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.979408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.979455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.979472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.979494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:07 crc kubenswrapper[4744]: I1205 20:11:07.979510 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:07Z","lastTransitionTime":"2025-12-05T20:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.080448 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:08 crc kubenswrapper[4744]: E1205 20:11:08.080642 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.084100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.084232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.084252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.084275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.084334 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.187483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.187541 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.187558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.187581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.187598 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.290612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.290674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.290696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.290723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.290744 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.393740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.393800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.393817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.393841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.393859 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.496866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.496918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.496935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.496958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.496977 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.600534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.600586 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.600603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.600627 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.600644 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.651378 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:08 crc kubenswrapper[4744]: E1205 20:11:08.651563 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:08 crc kubenswrapper[4744]: E1205 20:11:08.651674 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:12.651648972 +0000 UTC m=+42.881460350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.703396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.703428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.703476 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.703497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.703510 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.806795 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.806858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.806878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.806902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.806923 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.910008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.910072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.910090 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.910115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:08 crc kubenswrapper[4744]: I1205 20:11:08.910133 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:08Z","lastTransitionTime":"2025-12-05T20:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.013414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.013463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.013480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.013506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.013522 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.079808 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.079813 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.079864 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:09 crc kubenswrapper[4744]: E1205 20:11:09.080525 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:09 crc kubenswrapper[4744]: E1205 20:11:09.080987 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:09 crc kubenswrapper[4744]: E1205 20:11:09.081084 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.116836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.116880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.116896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.116918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.116935 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.220027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.220588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.220819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.221027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.221456 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.325012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.325061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.325080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.325103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.325121 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.429186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.429277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.429657 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.430092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.430520 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.534943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.535696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.535746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.535777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.535799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.638998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.639136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.639158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.639183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.639202 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.742881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.742927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.742945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.742969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.742987 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.846746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.847261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.847597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.847827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.848054 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.951506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.951572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.951595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.951626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:09 crc kubenswrapper[4744]: I1205 20:11:09.951647 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:09Z","lastTransitionTime":"2025-12-05T20:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.054392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.054441 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.054453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.054467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.054478 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.080355 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:10 crc kubenswrapper[4744]: E1205 20:11:10.081326 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.095234 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.125385 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc4
7c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://624d3c81bff26819be8b49ca9a8d679459364ef111387b345945f3bb7c245d89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:02Z\\\",\\\"message\\\":\\\" 6047 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 20:11:01.175802 6047 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 20:11:01.175897 6047 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 20:11:01.175975 6047 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:01.175987 6047 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 20:11:01.176039 6047 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 20:11:01.176058 6047 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 20:11:01.176114 6047 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:01.176134 6047 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 20:11:01.176160 6047 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 20:11:01.176157 6047 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 20:11:01.176183 6047 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 20:11:01.176202 6047 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:01.176230 6047 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:01.176122 6047 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:01.176212 6047 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.142828 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.156784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.157143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.157428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.157682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.158357 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.164108 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.180945 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.198971 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.211373 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.230471 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adb
e9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.242188 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.259746 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.262974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.263018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.263032 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.263047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.263056 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.279587 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.313124 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.349282 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.365284 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.365371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.365391 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.365414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.365429 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.371933 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.386030 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.397085 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.406845 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.468165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.468243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.468266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.468332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.468353 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.571851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.571901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.571919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.571941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.571962 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.674723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.674793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.674817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.674846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.674867 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.778381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.778446 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.778467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.778497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.778518 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.881106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.881160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.881178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.881254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.881285 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.984027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.984100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.984118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.984141 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:10 crc kubenswrapper[4744]: I1205 20:11:10.984157 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:10Z","lastTransitionTime":"2025-12-05T20:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.080200 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.080228 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.080426 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:11 crc kubenswrapper[4744]: E1205 20:11:11.080601 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:11 crc kubenswrapper[4744]: E1205 20:11:11.080811 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:11 crc kubenswrapper[4744]: E1205 20:11:11.081035 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.087610 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.087659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.087670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.087686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.087700 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.190437 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.190518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.190541 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.190572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.190595 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.293974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.294052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.294069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.294094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.294111 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.396776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.396822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.396830 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.396844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.396853 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.499005 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.499067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.499085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.499108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.499126 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.602145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.602206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.602226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.602251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.602269 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.705498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.705565 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.705588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.705618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.705640 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.808825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.808886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.808902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.808925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.808942 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.911772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.911839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.911863 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.911894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:11 crc kubenswrapper[4744]: I1205 20:11:11.911916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:11Z","lastTransitionTime":"2025-12-05T20:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.014587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.014663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.014682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.014708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.014726 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.079771 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.080018 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.117989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.118050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.118067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.118088 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.118107 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.221604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.221672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.221697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.221725 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.221749 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.324977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.325046 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.325067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.325091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.325110 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.428631 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.428696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.428718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.428744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.428765 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.485834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.485929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.485957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.485991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.486016 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.507548 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.512535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.512604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.512629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.512662 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.512684 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.531326 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.537175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.537443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
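Every one of these status-patch failures has the same root cause, spelled out at the end of the err string: the serving certificate for the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05T20:11:12Z. A minimal Go sketch of how the expiry could be confirmed from the node (an illustration, not part of the log; the handshake skips verification precisely so the expired chain can still be inspected):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip verification so the handshake succeeds even though the
	// chain is expired; we only want to read the certificate dates.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("TLS handshake failed: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("expired: matches the x509 error in the kubelet log")
	}
}
```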
event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.537677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.537860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.537990 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.556398 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.561443 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.561502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
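The NodeNotReady / KubeletNotReady condition is a second, independent symptom: the kubelet keeps Ready=False because /etc/kubernetes/cni/net.d/ contains no CNI configuration yet (on OpenShift that file is normally written by the network plugin's node agent once it is running). A rough sketch of the directory check the log message implies, assuming the conventional .conf/.conflist/.json names; the kubelet's real probe lives in the CRI runtime, so this is only an illustration:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// hasCNIConfig reports whether any CNI network config is present in dir,
// mirroring the "no CNI configuration file" condition in the log.
func hasCNIConfig(dir string) (bool, error) {
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err != nil {
			return false, err
		}
		if len(matches) > 0 {
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println("CNI config present:", ok, "err:", err)
}
```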
event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.561525 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.561551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.561569 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.582460 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.588042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.588116 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.588137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.588165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.588183 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.611089 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:12Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.611694 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.614473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
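The 20:11:12.611694 line marks the end of one full sync cycle: the kubelet attempts the status update a fixed number of times (nodeStatusUpdateRetry, 5 in the upstream kubelet source) and only then reports "update node status exceeds retry count"; the five "will retry" errors above line up with that count. A simplified sketch of the loop, where the constant and messages mirror the log and the rest is illustrative:

```go
package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry matches the upstream kubelet constant.
const nodeStatusUpdateRetry = 5

// updateNodeStatus retries a status sync a bounded number of times,
// logging each failure, then gives up as the kubelet does above.
func updateNodeStatus(tryUpdate func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		err := tryUpdate()
		if err == nil {
			return nil
		}
		fmt.Printf("Error updating node status, will retry: %v\n", err)
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	webhookErr := errors.New("x509: certificate has expired or is not yet valid")
	if err := updateNodeStatus(func() error { return webhookErr }); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```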
event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.614528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.614546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.614571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.614588 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.702762 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.703037 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:12 crc kubenswrapper[4744]: E1205 20:11:12.703438 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:20.703405135 +0000 UTC m=+50.933216543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.717433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.718037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.718073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.718115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.718148 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.822233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.822360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.822383 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.822412 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.822435 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.926068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.926142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.926165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.926198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:12 crc kubenswrapper[4744]: I1205 20:11:12.926221 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:12Z","lastTransitionTime":"2025-12-05T20:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.029914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.029977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.030000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.030030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.030050 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.079952 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.079961 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:13 crc kubenswrapper[4744]: E1205 20:11:13.080200 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.079974 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:13 crc kubenswrapper[4744]: E1205 20:11:13.080325 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:13 crc kubenswrapper[4744]: E1205 20:11:13.080391 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.133620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.133774 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.133794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.133819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.133838 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.236473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.236509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.236518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.236532 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.236541 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.340136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.340184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.340197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.340213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.340225 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.442855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.442920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.442940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.442966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.442984 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.546124 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.546521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.546653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.546791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.546922 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.649961 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.650123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.650144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.650167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.650185 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.753857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.753906 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.753923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.753945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.753964 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.856727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.856797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.856819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.856849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.856870 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.960592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.961004 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.961154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.961376 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:13 crc kubenswrapper[4744]: I1205 20:11:13.961531 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:13Z","lastTransitionTime":"2025-12-05T20:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.066384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.066455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.066472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.066496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.066514 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.080088 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:14 crc kubenswrapper[4744]: E1205 20:11:14.080373 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.170282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.170346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.170357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.170374 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.170388 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.273351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.273801 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.273966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.274132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.274284 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.377594 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.377766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.377825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.377858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.377880 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.480077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.480124 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.480143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.480167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.480184 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.583248 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.583347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.583365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.583390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.583408 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.686578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.686626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.686683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.686711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.686730 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.789114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.789182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.789202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.789232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.789253 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.892382 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.892472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.892506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.892534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.892554 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.995605 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.995663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.995675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.995693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:14 crc kubenswrapper[4744]: I1205 20:11:14.995705 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:14Z","lastTransitionTime":"2025-12-05T20:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.079684 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.079776 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:15 crc kubenswrapper[4744]: E1205 20:11:15.079879 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:15 crc kubenswrapper[4744]: E1205 20:11:15.080012 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.080148 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:15 crc kubenswrapper[4744]: E1205 20:11:15.080268 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.099179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.099241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.099260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.099323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.099345 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.203197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.203823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.203994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.204155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.204351 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.308332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.309083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.309272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.309483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.309627 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.413239 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.413381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.413401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.413424 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.413442 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.516905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.516949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.516965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.516987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.517009 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.620077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.620460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.620626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.620793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.620922 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.723955 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.724005 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.724026 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.724049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.724066 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.827631 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.827684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.827706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.827736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.827758 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.930715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.931119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.931371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.931611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:15 crc kubenswrapper[4744]: I1205 20:11:15.931844 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:15Z","lastTransitionTime":"2025-12-05T20:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.035527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.035583 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.035598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.035621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.035638 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.079612 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:16 crc kubenswrapper[4744]: E1205 20:11:16.079807 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.080926 4744 scope.go:117] "RemoveContainer" containerID="6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.120997 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.138725 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.142510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.142563 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.142577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.142596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.142609 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.150756 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.165481 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.177016 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.190354 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.200869 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.219883 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd2
8451af2af629783691706896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.230794 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.244416 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.245431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.245502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.245513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.245532 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.245562 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.261789 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.277412 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.299350 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.316735 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.330351 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.344504 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.348629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.348655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.348666 4744 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.348681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.348691 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.359363 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.450972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.451019 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.451029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.451042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.451051 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.489057 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/1.log" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.491714 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.492264 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.508471 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668
659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.527871 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.544324 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.554831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.554928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.554954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.554987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.555009 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.564197 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.578794 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.595773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.612062 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.660592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.660625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.660634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.660648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.660659 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.664961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.676069 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.687157 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.707047 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.724853 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.757566 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d
4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.763144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.763177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.763190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.763207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.763220 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.776090 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.790164 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.804667 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.817920 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:16Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.866089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.867426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.867560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.868123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.868249 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.970019 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.970077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.970089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.970102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:16 crc kubenswrapper[4744]: I1205 20:11:16.970112 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:16Z","lastTransitionTime":"2025-12-05T20:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.072555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.072635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.072861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.072887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.072902 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.080541 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.080586 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:17 crc kubenswrapper[4744]: E1205 20:11:17.080653 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:17 crc kubenswrapper[4744]: E1205 20:11:17.080750 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.080848 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:17 crc kubenswrapper[4744]: E1205 20:11:17.080946 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.176085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.176159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.176178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.176199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.176212 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.278469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.278536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.278591 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.278618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.278638 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.382163 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.382218 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.382236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.382260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.382276 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.485640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.485703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.485720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.485743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.485760 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.589697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.589764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.589780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.589803 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.589821 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.693039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.693115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.693132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.693157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.693177 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.796402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.796480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.796497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.796521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.796538 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.899790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.899873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.899895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.899923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:17 crc kubenswrapper[4744]: I1205 20:11:17.899945 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:17Z","lastTransitionTime":"2025-12-05T20:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.003258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.003397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.003418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.003489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.003518 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.080373 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:18 crc kubenswrapper[4744]: E1205 20:11:18.080584 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.106978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.107034 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.107052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.107080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.107096 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.209898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.209984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.210004 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.210034 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.210053 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.313426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.313479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.313495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.313521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.313539 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.417168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.417246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.417270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.417346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.417371 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.502790 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/2.log" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.503954 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/1.log" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.508696 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245" exitCode=1 Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.508757 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.508819 4744 scope.go:117] "RemoveContainer" containerID="6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.510088 4744 scope.go:117] "RemoveContainer" containerID="6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245" Dec 05 20:11:18 crc kubenswrapper[4744]: E1205 20:11:18.510485 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.524468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.524544 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.524571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.524922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.524970 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.532222 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.566994 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.587515 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.608556 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.627960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.628000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.628013 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.628029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.628041 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.628010 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.643267 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218
476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.658182 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.669392 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.696934 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; 
gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe39
6b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.712637 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.730333 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.732159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.732230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.732256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.732322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.732348 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.749896 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.769420 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.785691 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.808967 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067461
6e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.830350 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.834933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.835101 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.835202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.835363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.835471 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.850605 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:18Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.939933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.940536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.940576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.940605 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:18 crc kubenswrapper[4744]: I1205 20:11:18.940623 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:18Z","lastTransitionTime":"2025-12-05T20:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.045013 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.045080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.045098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.045121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.045138 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.080487 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.080528 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:19 crc kubenswrapper[4744]: E1205 20:11:19.080641 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.080504 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:19 crc kubenswrapper[4744]: E1205 20:11:19.080746 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:19 crc kubenswrapper[4744]: E1205 20:11:19.080836 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.148261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.148384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.148403 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.148430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.148447 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.252202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.252260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.252277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.252328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.252347 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.355210 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.355378 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.355407 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.355439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.355465 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.458806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.458885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.458912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.458939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.458956 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.516819 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/2.log" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.561779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.561841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.561857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.561880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.561898 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.665918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.666073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.666097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.666125 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.666175 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.769036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.769130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.769149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.769177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.769195 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.872785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.872855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.872872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.872896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.872913 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.976320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.976386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.976406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.976426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:19 crc kubenswrapper[4744]: I1205 20:11:19.976442 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:19Z","lastTransitionTime":"2025-12-05T20:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.079951 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.080184 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.080502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.080568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.080587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.080614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.080631 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.096609 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.115755 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.136276 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe989
0e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"
name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.152235 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.167872 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.183750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.184145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.184227 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.184322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.184412 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.184688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.215162 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.232206 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.240996 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.252737 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.263287 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.275006 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.284942 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.286726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.286748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.286757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.286771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.286780 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.306450 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.321441 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.344131 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.359911 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:20Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.390100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.390144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.390160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.390183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.390200 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.492896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.492947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.492957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.492970 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.492981 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.496614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.496762 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:11:52.496742548 +0000 UTC m=+82.726553906 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.496841 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.496894 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.496993 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.497011 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.497030 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:52.497023745 +0000 UTC m=+82.726835113 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.497090 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:52.497068496 +0000 UTC m=+82.726879894 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.596174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.596245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.596270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.596332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.596356 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.597857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.597963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598146 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598188 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598190 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598216 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598228 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598238 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598351 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:52.598328393 +0000 UTC m=+82.828139802 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.598383 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:52.598371465 +0000 UTC m=+82.828182863 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.698947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.699176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.699207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.699239 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.699265 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.800981 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.801218 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: E1205 20:11:20.801350 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:11:36.80131837 +0000 UTC m=+67.031129778 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.802428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.802492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.802511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.802536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.802552 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.905231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.905336 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.905361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.905389 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:20 crc kubenswrapper[4744]: I1205 20:11:20.905405 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:20Z","lastTransitionTime":"2025-12-05T20:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.008756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.008836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.008855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.008879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.008897 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.080080 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.080216 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.080117 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:21 crc kubenswrapper[4744]: E1205 20:11:21.080347 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:21 crc kubenswrapper[4744]: E1205 20:11:21.080520 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:21 crc kubenswrapper[4744]: E1205 20:11:21.080685 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.112335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.112387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.112403 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.112428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.112447 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.214758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.214809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.214824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.214844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.214861 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.317651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.317721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.317745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.317775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.317798 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.421599 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.421663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.421685 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.421714 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.421734 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.525075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.525263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.525328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.525358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.525375 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.628146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.628204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.628221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.628244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.628262 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.731420 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.731490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.731507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.731533 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.731550 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.835367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.835431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.835452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.835477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.835494 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.940209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.940279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.940323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.940346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:21 crc kubenswrapper[4744]: I1205 20:11:21.940359 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:21Z","lastTransitionTime":"2025-12-05T20:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.042874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.042913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.042929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.042949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.042961 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.080670 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:22 crc kubenswrapper[4744]: E1205 20:11:22.080876 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.146522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.147115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.147153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.147180 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.147200 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.250369 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.250433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.250452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.250479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.250499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.353826 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.353889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.353911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.353969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.353991 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.456930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.456992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.457009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.457035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.457052 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.559552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.559618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.559641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.559683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.559763 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.663048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.663112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.663129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.663158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.663177 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.766780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.766841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.766860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.766884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.766901 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.870183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.870279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.870328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.870366 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.870385 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.961799 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.961863 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.961887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.961914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.961934 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:22 crc kubenswrapper[4744]: E1205 20:11:22.982948 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:22Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.988677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.988736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.988755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.988781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:22 crc kubenswrapper[4744]: I1205 20:11:22.988799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:22Z","lastTransitionTime":"2025-12-05T20:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.008685 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:23Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.013598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.013643 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.013660 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.013684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.013701 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.032705 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:23Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.038287 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.038381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.038401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.038421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.038436 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.059771 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:23Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.064832 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.064882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
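
Stripped of its klog escaping, each of these rejected patches is an ordinary node-status body: $setElementOrder/conditions for strategic-merge ordering, allocatable and capacity quantities, the four conditions (MemoryPressure, DiskPressure, PIDPressure, Ready), the cached image list, and nodeInfo. A rough sketch of that shape, decoded from a trimmed, unescaped fragment of the payload above (the struct names here are illustrative, not kubelet types):

// nodestatus.go - decode a trimmed fragment of the "failed to patch
// status" body; JSON keys mirror the ones visible in the log.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type condition struct {
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Message            string `json:"message"`
	Reason             string `json:"reason"`
	Status             string `json:"status"`
	Type               string `json:"type"`
}

type nodePatch struct {
	Status struct {
		Allocatable map[string]string `json:"allocatable"`
		Capacity    map[string]string `json:"capacity"`
		Conditions  []condition       `json:"conditions"`
	} `json:"status"`
}

func main() {
	raw := `{"status":{
	  "allocatable":{"cpu":"11800m","ephemeral-storage":"76396645454","memory":"32404560Ki"},
	  "capacity":{"cpu":"12","ephemeral-storage":"83293888Ki","memory":"32865360Ki"},
	  "conditions":[{"lastHeartbeatTime":"2025-12-05T20:11:23Z",
	    "lastTransitionTime":"2025-12-05T20:11:23Z",
	    "message":"container runtime network not ready",
	    "reason":"KubeletNotReady","status":"False","type":"Ready"}]}}`
	var p nodePatch
	if err := json.Unmarshal([]byte(raw), &p); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("allocatable cpu=%s memory=%s\n",
		p.Status.Allocatable["cpu"], p.Status.Allocatable["memory"])
	for _, c := range p.Status.Conditions {
		fmt.Printf("%s=%s reason=%s\n", c.Type, c.Status, c.Reason)
	}
}
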
event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.064899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.064922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.064941 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.080123 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.080202 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.080135 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.080284 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.080427 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.080542 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.084347 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:23Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:23 crc kubenswrapper[4744]: E1205 20:11:23.084581 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.087079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
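
"Error updating node status, will retry" followed by "update node status exceeds retry count" is the kubelet's bounded retry loop giving up: the upstream kubelet attempts tryUpdateNodeStatus a fixed number of times (nodeStatusUpdateRetry, 5 in the upstream source) before emitting exactly this error and waiting for the next sync interval. A compressed sketch of that pattern, modeled on the messages in this log rather than copied from kubelet internals:

// retryloop.go - the bounded-retry shape behind the two messages above;
// the error text follows this log, the loop follows the kubelet pattern.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumed to match the upstream constant

func tryUpdateNodeStatus() error {
	// Stand-in for the PATCH that the admission webhook rejects with
	// the expired-certificate error seen throughout this journal.
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
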
event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.087233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.087269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.087345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.087375 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.190055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.190128 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.190143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.190166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.190182 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.293365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.293408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.293417 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.293431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.293441 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.397389 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.397447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.397464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.397488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.397508 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.500577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.500679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.500698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.500728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.500751 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.603828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.603893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.603915 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.603944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.603962 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.706256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.706331 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.706350 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.706375 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.706392 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.809915 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.809984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.810000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.810025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.810042 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.913630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.913668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.913682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.913703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:23 crc kubenswrapper[4744]: I1205 20:11:23.913719 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:23Z","lastTransitionTime":"2025-12-05T20:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.016748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.016810 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.016829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.016855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.016874 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.079825 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:24 crc kubenswrapper[4744]: E1205 20:11:24.080075 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.120216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.120314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.120333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.120360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.120419 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.223440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.223516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.223537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.223575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.223602 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.326641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.326721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.326746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.326779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.326802 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.430886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.430960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.431012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.431047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.431074 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.486914 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.504439 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.514396 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.534335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.534652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.534699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.534731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.534750 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.539167 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.562001 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.582433 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.598551 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.618921 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.636233 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.638577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.638617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.638634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.638656 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.638672 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.653726 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.674635 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.690156 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.709074 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.727807 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.742176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.742232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.742246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.742267 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.742284 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.761449 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.779811 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.797473 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.818588 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.842877 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe989
0e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"
name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:24Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.844429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.844640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.844758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.844889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.845020 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.948501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.948577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.948635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.948677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:24 crc kubenswrapper[4744]: I1205 20:11:24.948703 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:24Z","lastTransitionTime":"2025-12-05T20:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.052375 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.052460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.052487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.052521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.052543 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.079753 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.079762 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.079777 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:25 crc kubenswrapper[4744]: E1205 20:11:25.080286 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:25 crc kubenswrapper[4744]: E1205 20:11:25.080487 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:25 crc kubenswrapper[4744]: E1205 20:11:25.081494 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.155834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.156184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.156402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.156765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.156981 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.259628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.259718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.259738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.259770 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.259799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.363079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.363147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.363165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.363198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.363216 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.467123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.467203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.467222 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.467246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.467264 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.570770 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.570836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.570852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.570878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.570896 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.674776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.674854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.674874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.674903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.674921 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.778205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.778331 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.778356 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.778381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.778398 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.881789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.881853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.881872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.881897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.881915 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.984618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.984683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.984702 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.984729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:25 crc kubenswrapper[4744]: I1205 20:11:25.984750 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:25Z","lastTransitionTime":"2025-12-05T20:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.080228 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:26 crc kubenswrapper[4744]: E1205 20:11:26.080450 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.087885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.087936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.087955 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.087977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.087996 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.190839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.190887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.190909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.190933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.190950 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.293513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.293570 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.293598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.293625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.293648 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.396787 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.396847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.396870 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.396899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.396920 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.499744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.499826 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.499840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.499853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.499861 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.602442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.602479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.602490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.602503 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.602513 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.705791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.705841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.705850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.705872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.705881 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.809357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.809408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.809422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.809442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.809456 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.912847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.912924 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.912951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.912983 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:26 crc kubenswrapper[4744]: I1205 20:11:26.913010 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:26Z","lastTransitionTime":"2025-12-05T20:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.016328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.016399 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.016418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.016453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.016472 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.079851 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.079851 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:27 crc kubenswrapper[4744]: E1205 20:11:27.080060 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.079886 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:27 crc kubenswrapper[4744]: E1205 20:11:27.080237 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:27 crc kubenswrapper[4744]: E1205 20:11:27.080413 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.119564 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.119625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.119643 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.119674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.119691 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.223082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.223156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.223180 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.223208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.223230 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.330762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.330832 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.330851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.330878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.330897 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.434063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.434127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.434150 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.434176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.434197 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.537346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.537380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.537415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.537432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.537443 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.640485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.640813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.640979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.641141 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.641447 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.744386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.744426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.744439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.744455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.744465 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.848556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.848585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.848594 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.848606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.848615 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.950986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.951040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.951054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.951074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:27 crc kubenswrapper[4744]: I1205 20:11:27.951086 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:27Z","lastTransitionTime":"2025-12-05T20:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.053563 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.053606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.053618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.053634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.053645 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.080574 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:11:28 crc kubenswrapper[4744]: E1205 20:11:28.080799 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.156876 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.156915 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.156928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.156947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.156958 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.260341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.260370 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.260378 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.260391 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.260399 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.362396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.362454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.362473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.362498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.362515 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.465093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.465143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.465181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.465211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.465233 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.568092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.568161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.568181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.568207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.568224 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.671950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.672031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.672067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.672098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.672121 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.775146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.775216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.775234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.775258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.775275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.878668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.878725 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.878744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.878767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.878785 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.981721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.981792 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.981805 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.981825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:28 crc kubenswrapper[4744]: I1205 20:11:28.981840 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:28Z","lastTransitionTime":"2025-12-05T20:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.080340 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.080380 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.080341 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:11:29 crc kubenswrapper[4744]: E1205 20:11:29.080552 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:11:29 crc kubenswrapper[4744]: E1205 20:11:29.080764 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:11:29 crc kubenswrapper[4744]: E1205 20:11:29.080941 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.084575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.084627 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.084645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.084669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.084687 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.187552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.187640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.187665 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.187696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.187719 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.291392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.291457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.291475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.291502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.291522 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.394587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.394650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.394667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.394690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.394708 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.498557 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.498634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.498659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.498691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.498715 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.601677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.601769 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.601789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.601812 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.601897 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.705581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.705655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.705672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.705696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.705713 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.808997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.809554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.809772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.810441 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.810686 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.913863 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.914182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.914454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.914667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:29 crc kubenswrapper[4744]: I1205 20:11:29.914873 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:29Z","lastTransitionTime":"2025-12-05T20:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.019514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.019581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.019598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.019625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.019642 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.080778 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:11:30 crc kubenswrapper[4744]: E1205 20:11:30.081039 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.096051 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.116184 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.122827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.122869 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.122885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.122908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.122924 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.135359 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.159433 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.191732 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.213463 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.226194 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.226250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.226269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.226318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.226337 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.232672 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.251793 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.265128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.281689 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.294781 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.310876 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.329764 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.330008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.330095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.330118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.330148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.330173 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.347728 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.361474 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.376601 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.409786 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd04
1c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e74c1d11a1393b33e2fb8aa1da416c60d753fd28451af2af629783691706896\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"}]\\\\nI1205 20:11:04.509779 6183 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/package-server-manager-metrics]} name:Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 20:11:04.509780 6183 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 20:11:04.509966 6183 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 20:11:04.510009 6183 ovnkube.go:599] Stopped ovnkube\\\\nI1205 20:11:04.510032 6183 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 20:11:04.510100 6183 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.423593 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:30Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.432137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.432191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.432218 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.432250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.432274 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.535474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.535540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.535559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.535588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.535606 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.638186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.638253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.638283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.638334 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.638352 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.744405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.744463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.744481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.744509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.744526 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.848140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.848204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.848226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.848254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.848275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.951929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.952024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.952044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.952080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:30 crc kubenswrapper[4744]: I1205 20:11:30.952102 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:30Z","lastTransitionTime":"2025-12-05T20:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.055112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.055458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.055674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.055906 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.056113 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.079697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.079763 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.079697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:31 crc kubenswrapper[4744]: E1205 20:11:31.079902 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:31 crc kubenswrapper[4744]: E1205 20:11:31.080048 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:31 crc kubenswrapper[4744]: E1205 20:11:31.080151 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.160058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.160143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.160165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.160197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.160217 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.263618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.263686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.263704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.263731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.263749 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.366263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.366406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.366431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.366465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.366487 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.469710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.469777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.469797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.469824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.469850 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.573377 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.573449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.573477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.573501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.573521 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.676381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.676439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.676457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.676481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.676499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.793142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.793208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.793225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.793250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.793266 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.896739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.896801 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.896817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.896845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:31 crc kubenswrapper[4744]: I1205 20:11:31.896863 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:31Z","lastTransitionTime":"2025-12-05T20:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.000886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.000941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.000958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.000983 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.001001 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.083561 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:32 crc kubenswrapper[4744]: E1205 20:11:32.083798 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.104120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.104175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.104260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.104288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.104382 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.206477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.206563 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.206575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.206590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.206601 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.309661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.309707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.309723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.309752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.309787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.412786 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.412829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.412846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.412868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.412886 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.515962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.516006 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.516024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.516048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.516064 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.619658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.619723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.619742 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.619767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.619786 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.722171 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.722241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.722258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.722282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.722330 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.824873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.824922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.824933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.824951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.824966 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.928644 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.928713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.928729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.928756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:32 crc kubenswrapper[4744]: I1205 20:11:32.928774 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:32Z","lastTransitionTime":"2025-12-05T20:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.032365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.032440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.032463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.032494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.032517 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.080762 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.080800 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.081082 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.081155 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.081752 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.081938 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.082611 4744 scope.go:117] "RemoveContainer" containerID="6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.082947 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.101978 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b908
96d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.118682 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.135110 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.135935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.135992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.136012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.136040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.136064 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.152976 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.183633 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.195768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.195831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.195856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.195888 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.195912 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.201881 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.217546 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.222376 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.222427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.222445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.222469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.222488 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.223252 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.240567 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.244281 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.250169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.250208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.250225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.250246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.250263 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.255508 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.269319 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273027 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.273542 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.286420 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.290538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.290589 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.290606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.290635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.290653 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.295400 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce
6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.308918 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: E1205 20:11:33.309141 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.310849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.310899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.310917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.310942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.310962 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.313420 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.327375 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.342984 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.359436 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.384362 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.404376 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420075 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:33Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420138 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420364 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.420393 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.522984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.523031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.523042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.523060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.523072 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.625686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.625736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.625754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.625777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.625794 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.728892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.728956 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.728975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.728999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.729016 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.832998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.833058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.833068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.833081 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.833089 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.935680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.935734 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.935751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.935773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:33 crc kubenswrapper[4744]: I1205 20:11:33.935790 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:33Z","lastTransitionTime":"2025-12-05T20:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.039209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.039267 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.039285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.039341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.039359 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.080707 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:34 crc kubenswrapper[4744]: E1205 20:11:34.080935 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.141729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.141771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.141787 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.141809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.141825 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.245339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.245372 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.245384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.245400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.245410 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.348909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.348969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.348985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.349003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.349013 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.450839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.450880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.450888 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.450904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.450913 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.553766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.553827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.553848 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.553872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.553888 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.656445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.656500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.656509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.656524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.656533 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.758914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.758952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.758962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.758978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.758988 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.861498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.861545 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.861559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.861575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.861587 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.965382 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.965417 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.965426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.965439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:34 crc kubenswrapper[4744]: I1205 20:11:34.965448 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:34Z","lastTransitionTime":"2025-12-05T20:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.067686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.067765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.067783 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.067811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.067831 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.080384 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.080424 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.080447 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:35 crc kubenswrapper[4744]: E1205 20:11:35.080524 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:35 crc kubenswrapper[4744]: E1205 20:11:35.080666 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:35 crc kubenswrapper[4744]: E1205 20:11:35.080779 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.170665 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.170713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.170728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.170746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.170757 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.274595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.274639 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.274656 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.274678 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.274697 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.377793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.377860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.377876 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.377901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.377922 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.480822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.480865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.480883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.480906 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.480923 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.584707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.584792 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.584818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.584850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.584870 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.686659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.686719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.686737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.686762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.686781 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.789612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.789675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.789693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.789720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.789737 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.893467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.893530 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.893549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.893576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.893593 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.996727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.996791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.996813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.996836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:35 crc kubenswrapper[4744]: I1205 20:11:35.996855 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:35Z","lastTransitionTime":"2025-12-05T20:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.080520 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:11:36 crc kubenswrapper[4744]: E1205 20:11:36.080755 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100286 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100512 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.100836 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.206168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.206209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.206224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.206242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.206256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.309119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.309188 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.309200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.309241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.309256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.411670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.411941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.412026 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.412106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.412186 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.515262 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.515342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.515363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.515386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.515404 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.617857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.617931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.617952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.617978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.617999 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.720275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.720629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.720761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.720883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.721005 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.807343 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:11:36 crc kubenswrapper[4744]: E1205 20:11:36.807521 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:11:36 crc kubenswrapper[4744]: E1205 20:11:36.807589 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:08.807568156 +0000 UTC m=+99.037379544 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.823144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.823181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.823192 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.823209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.823219 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.925981 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.926012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.926023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.926038 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:36 crc kubenswrapper[4744]: I1205 20:11:36.926046 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:36Z","lastTransitionTime":"2025-12-05T20:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.029022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.029056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.029065 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.029078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.029088 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.080189 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.080219 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:11:37 crc kubenswrapper[4744]: E1205 20:11:37.080328 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.080186 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:11:37 crc kubenswrapper[4744]: E1205 20:11:37.080419 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:11:37 crc kubenswrapper[4744]: E1205 20:11:37.080512 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.131487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.131520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.131530 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.131544 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.131554 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.234537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.234577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.234590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.234607 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.234619 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.337113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.337151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.337162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.337177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.337190 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.439483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.439551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.439572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.439600 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.439623 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.549180 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.549595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.549621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.549646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.549663 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.606155 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/0.log" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.606204 4744 generic.go:334] "Generic (PLEG): container finished" podID="89bdeba9-f644-4465-a9f8-82c682f6aea3" containerID="07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547" exitCode=1 Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.606236 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerDied","Data":"07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547"} Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.606620 4744 scope.go:117] "RemoveContainer" containerID="07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.623243 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPa
th\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.633098 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.645691 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.652188 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.652213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.652223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.652241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.652254 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.660822 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.675561 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.686829 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.709161 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.724354 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.739284 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.753941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.753972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.753982 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.753999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.754045 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.758722 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.772665 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218
476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.784843 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.796458 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.817614 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.830273 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.841773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.852331 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.855990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.856029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.856041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.856062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.856074 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.864206 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.874427 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:37Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.958897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.958937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.958945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.958959 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:37 crc kubenswrapper[4744]: I1205 20:11:37.958971 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:37Z","lastTransitionTime":"2025-12-05T20:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.061257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.061283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.061307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.061319 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.061327 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.080643 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:11:38 crc kubenswrapper[4744]: E1205 20:11:38.080834 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.163396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.163427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.163438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.163451 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.163460 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.265919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.265973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.265983 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.265995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.266004 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.368329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.368380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.368392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.368409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.368421 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.470508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.470921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.471067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.471224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.471403 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.574252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.574322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.574333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.574352 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.574363 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.610867 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/0.log"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.610914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerStarted","Data":"6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.630115 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.646176 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.659625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.676438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.676483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.676495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.676511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.676522 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.679462 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.691232 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.705884 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.717266 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.734800 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.748398 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.760445 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.771931 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.778682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.778717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.778728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.778745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.778755 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.788519 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z"
Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.813540 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.829722 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.854823 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.871063 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.882492 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.882531 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.882540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.882553 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.882562 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.890450 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.907906 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.920099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:38Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.985212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.985268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.985325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.985359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:38 crc kubenswrapper[4744]: I1205 20:11:38.985390 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:38Z","lastTransitionTime":"2025-12-05T20:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.080230 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.080249 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:39 crc kubenswrapper[4744]: E1205 20:11:39.080398 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.080599 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:39 crc kubenswrapper[4744]: E1205 20:11:39.080633 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:39 crc kubenswrapper[4744]: E1205 20:11:39.080873 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.088667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.088728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.088746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.088771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.088789 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.191598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.191637 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.191648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.191661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.191671 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.293893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.293948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.293966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.293990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.294004 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.396468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.396524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.396547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.396575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.396597 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.498771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.498810 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.498821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.498837 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.498846 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.601585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.601665 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.601690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.601719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.601744 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.704442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.704501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.704519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.704541 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.704560 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.806616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.806690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.806709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.806737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.806755 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.908823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.908866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.908878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.908897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:39 crc kubenswrapper[4744]: I1205 20:11:39.908908 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:39Z","lastTransitionTime":"2025-12-05T20:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.011277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.011388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.011407 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.011430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.011448 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.079813 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:40 crc kubenswrapper[4744]: E1205 20:11:40.079955 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.105271 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"n
ame\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.113831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.113920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.113940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.113966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.114016 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.121553 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.134472 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.147546 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.161186 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.176010 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.187922 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.200556 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.214635 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.219889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.219945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.219963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.219982 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.219998 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.234560 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.250356 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.263847 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.281555 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd04
1c8fff060abc302e6b0cf245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.297463 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.308602 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322158 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.322118 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.340064 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.355114 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] 
multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.371437 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:40Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.424882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.424931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc 
kubenswrapper[4744]: I1205 20:11:40.424944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.424960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.425314 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.527882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.527917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.527928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.527944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.527956 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.630027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.630087 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.630105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.630128 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.630145 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.732556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.732602 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.732611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.732625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.732635 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.835567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.835604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.835635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.835652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.835668 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.939316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.939366 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.939416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.939441 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:40 crc kubenswrapper[4744]: I1205 20:11:40.939457 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:40Z","lastTransitionTime":"2025-12-05T20:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.041784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.041840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.041855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.041872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.041886 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.080492 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:41 crc kubenswrapper[4744]: E1205 20:11:41.080598 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.080621 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.080662 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:41 crc kubenswrapper[4744]: E1205 20:11:41.080725 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:41 crc kubenswrapper[4744]: E1205 20:11:41.080781 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.144160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.144234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.144253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.144278 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.144323 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.246733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.246823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.246848 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.246880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.246908 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.349584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.349645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.349663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.349686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.349703 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.453040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.453077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.453086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.453099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.453108 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.555176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.555215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.555225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.555238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.555248 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.657161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.657360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.657442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.657504 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.657564 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.760364 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.760427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.760444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.760469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.760493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.863040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.863068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.863079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.863091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.863101 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.965706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.965739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.965749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.965761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:41 crc kubenswrapper[4744]: I1205 20:11:41.965771 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:41Z","lastTransitionTime":"2025-12-05T20:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.068955 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.069017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.069031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.069050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.069063 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.080429 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:42 crc kubenswrapper[4744]: E1205 20:11:42.080569 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.171692 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.171731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.171740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.171754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.171764 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.274321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.274387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.274405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.274430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.274449 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.377426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.377474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.377484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.377501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.377512 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.479868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.479922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.479939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.479962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.479979 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.582971 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.583041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.583062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.583086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.583102 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.685416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.685474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.685492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.685515 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.685533 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.792186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.792252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.792273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.792343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.792368 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.895328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.895387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.895411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.895440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.895463 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.998130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.998193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.998209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.998234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:42 crc kubenswrapper[4744]: I1205 20:11:42.998253 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:42Z","lastTransitionTime":"2025-12-05T20:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.080338 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.080398 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.080436 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.080544 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.080693 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.080823 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.100654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.100711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.100735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.100760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.100779 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.204237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.204415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.204441 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.204473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.204492 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.306959 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.307000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.307012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.307036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.307052 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.410672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.410730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.410748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.410770 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.410789 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.513525 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.513599 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.513621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.513651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.513673 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.515230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.515373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.515403 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.515428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.515447 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.539071 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.545270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.545343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.545360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.545385 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.545401 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.562986 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.569099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.569140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.569156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.569178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.569194 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.590321 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.595332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.595421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.595440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.595522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.595548 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.621264 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.626921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.626965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.626983 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.627004 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.627020 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.650087 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:43Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:43 crc kubenswrapper[4744]: E1205 20:11:43.650356 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.652507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.652565 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.652588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.652616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.652635 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.755691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.755749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.755765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.755788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.755804 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.859143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.859238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.859263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.859386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.859418 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.962468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.962559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.962597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.962626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:43 crc kubenswrapper[4744]: I1205 20:11:43.962647 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:43Z","lastTransitionTime":"2025-12-05T20:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.066151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.066219 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.066242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.066270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.066323 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.080261 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:44 crc kubenswrapper[4744]: E1205 20:11:44.080886 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.081264 4744 scope.go:117] "RemoveContainer" containerID="6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.169021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.169079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.169097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.169121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.169138 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.272893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.272956 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.272973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.272997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.273015 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.376949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.377033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.377058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.377091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.377109 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.479757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.479816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.479834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.479858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.479875 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.583319 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.583382 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.583401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.583424 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.583442 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.686678 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.686740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.686757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.686779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.686798 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.789242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.789345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.789373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.789401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.789423 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.892328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.892368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.892380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.892397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.892409 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.995679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.995727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.995744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.995766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:44 crc kubenswrapper[4744]: I1205 20:11:44.995781 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:44Z","lastTransitionTime":"2025-12-05T20:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.080493 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:45 crc kubenswrapper[4744]: E1205 20:11:45.080648 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.080933 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:45 crc kubenswrapper[4744]: E1205 20:11:45.081021 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.081205 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:45 crc kubenswrapper[4744]: E1205 20:11:45.081316 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.097726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.097756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.097767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.097781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.097793 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.200273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.200344 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.200363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.200386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.200403 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.302938 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.302991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.303007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.303029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.303044 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.406177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.406230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.406248 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.406270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.406287 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.508671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.508903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.509024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.509174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.509284 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.612668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.613229 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.613367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.613492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.613607 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.638996 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/2.log" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.643355 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.643941 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.668226 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.684523 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.703828 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.716942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.717202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.717507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.717710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.717890 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.726052 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.747347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.760690 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.785481 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.806830 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c
04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.820641 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.820809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.820938 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.821062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.821178 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.822426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.838329 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.850787 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.861430 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.874926 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.897894 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.913181 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.923735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.923932 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.924064 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.924199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.924479 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:45Z","lastTransitionTime":"2025-12-05T20:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.930799 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 
20:11:45.940987 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.958510 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:45 crc kubenswrapper[4744]: I1205 20:11:45.970921 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:45Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.028420 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.028498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.028515 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.028542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.028563 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.080691 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:46 crc kubenswrapper[4744]: E1205 20:11:46.080837 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.131952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.132027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.132039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.132057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.132071 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.234779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.234828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.234840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.234859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.234874 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.337981 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.338045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.338067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.338096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.338118 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.442626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.442717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.442737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.442763 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.442780 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.545944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.546063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.546080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.546103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.546131 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.648872 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/3.log" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649201 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649239 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.649874 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/2.log" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.653747 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" exitCode=1 Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.653798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.653841 4744 scope.go:117] "RemoveContainer" containerID="6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.654768 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:11:46 crc kubenswrapper[4744]: E1205 20:11:46.655005 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.672737 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.690889 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.709667 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.729206 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.750367 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.752093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.752122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc 
kubenswrapper[4744]: I1205 20:11:46.752135 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.752242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.752315 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.782381 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.803351 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.821930 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.839980 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.852875 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.856069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.856107 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.856121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.856138 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.856149 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.865954 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.877356 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.892547 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.911711 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.928255 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.942369 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.957016 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.959691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.959942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.959967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.960069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.960088 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:46Z","lastTransitionTime":"2025-12-05T20:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:46 crc kubenswrapper[4744]: I1205 20:11:46.987049 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f97b933c3f591dc129ef131b083f8f98650bd041c8fff060abc302e6b0cf245\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:17Z\\\",\\\"message\\\":\\\" ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1205 20:11:17.150964 6380 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}\\\\nI1205 20:11:17.150995 6380 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-scheduler-operator for network=default : 9.599849ms\\\\nI1205 20:11:17.150653 6380 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF1205 20:11:17.151004 6380 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:45Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1205 20:11:45.545630 6745 factory.go:656] Stopping watch factory\\\\nI1205 20:11:45.545646 6745 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:45.545654 6745 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:45.545543 6745 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.545679 6745 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:45.545687 6745 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:45.545693 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:45.545878 6745 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.546194 6745 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:11:45.546368 6745 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:46Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.002992 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.063953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.064027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.064041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.064069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.064089 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.080438 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.080500 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.080641 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:47 crc kubenswrapper[4744]: E1205 20:11:47.080643 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:47 crc kubenswrapper[4744]: E1205 20:11:47.080777 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:47 crc kubenswrapper[4744]: E1205 20:11:47.080923 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.167138 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.167200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.167224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.167245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.167260 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.270029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.270077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.270093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.270113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.270126 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.372700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.372750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.372765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.372785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.372799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.475225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.475260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.475272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.475322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.475339 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.578341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.578390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.578406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.578427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.578443 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.659412 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/3.log" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.666658 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:11:47 crc kubenswrapper[4744]: E1205 20:11:47.666867 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.681380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.681421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.681433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.681453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.681466 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.684206 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.699744 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.713073 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.730364 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 
20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.747773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.758755 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.771138 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.781988 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.784008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.784036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.784047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.784063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.784076 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.802033 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:45Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1205 20:11:45.545630 6745 factory.go:656] Stopping watch factory\\\\nI1205 20:11:45.545646 6745 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:45.545654 6745 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:45.545543 6745 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.545679 6745 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:45.545687 6745 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:45.545693 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:45.545878 6745 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.546194 6745 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:11:45.546368 6745 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.813446 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c
8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.827980 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.842360 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.856512 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.873223 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.886834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.886871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc 
kubenswrapper[4744]: I1205 20:11:47.886884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.886900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.886911 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.895765 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z"
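
Here is the failure that repeats for every status patch in this stretch: the API server cannot call the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05T20:11:47Z. What fails during the TLS handshake is the ordinary x509 validity-window check; the sketch below (a hypothetical standalone probe, certcheck.go, not kubelet or apiserver code) applies the same NotBefore/NotAfter test to any PEM certificate you point it at:

// certcheck.go - print a PEM certificate's validity window and apply the
// same test that fails in the handshake above.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: certcheck <cert.pem>")
		os.Exit(1)
	}
	data, err := os.ReadFile(os.Args[1]) // e.g. the webhook's serving cert
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("NotBefore: %s\nNotAfter:  %s\n", cert.NotBefore, cert.NotAfter)
	if now := time.Now(); now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		// The condition behind "certificate has expired or is not yet valid".
		fmt.Printf("INVALID at %s\n", now.UTC().Format(time.RFC3339))
		os.Exit(2)
	}
	fmt.Println("certificate is currently valid")
}

The kube-apiserver-crc status patch that follows is rejected by the identical handshake:

Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.917411 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 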
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.930807 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z"
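
The next failing patch is notable because it is self-referential: the pod whose status cannot be updated is network-node-identity-vrzqb, the pod backing the very webhook that rejects every patch. Its status JSON below shows the "webhook" container Running since 20:10:49, so the endpoint is evidently up and answering handshakes; it is simply serving an expired certificate. A minimal probe of the same endpoint (hypothetical file name webhookprobe.go; run on the node, since the webhook listens on 127.0.0.1) would surface the verification failure directly:

// webhookprobe.go - POST to the webhook URL from the log and report the
// TLS error. The exact error text depends on the trust pool: with the
// cluster CA loaded into RootCAs this reproduces the same "certificate
// has expired" message seen above; with plain system roots (used here
// for brevity) it may report an unknown authority instead.
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
	"time"
)

func main() {
	client := &http.Client{Timeout: 10 * time.Second}
	resp, err := client.Post("https://127.0.0.1:9743/pod?timeout=10s",
		"application/json", bytes.NewReader([]byte(`{}`)))
	if err != nil {
		fmt.Fprintln(os.Stderr, "webhook call failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	fmt.Println("webhook answered:", resp.Status)
}

Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.947522 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 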
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.959916 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:47Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.989730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.989782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.989800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.989824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:47 crc kubenswrapper[4744]: I1205 20:11:47.989842 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:47Z","lastTransitionTime":"2025-12-05T20:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.079808 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:48 crc kubenswrapper[4744]: E1205 20:11:48.080005 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.092186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.092228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.092245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.092263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:48 crc kubenswrapper[4744]: I1205 20:11:48.092275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:48Z","lastTransitionTime":"2025-12-05T20:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
[... identical "Recording event message for node" blocks (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and setters.go:603 "Node became not ready" conditions repeat at roughly 100 ms intervals from 20:11:48.194964 through 20:11:49.020529, all carrying the same KubeletNotReady / "no CNI configuration file in /etc/kubernetes/cni/net.d/" message ...]
Dec 05 20:11:49 crc kubenswrapper[4744]: I1205 20:11:49.080337 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:49 crc kubenswrapper[4744]: I1205 20:11:49.080379 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:49 crc kubenswrapper[4744]: I1205 20:11:49.080474 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:49 crc kubenswrapper[4744]: E1205 20:11:49.080512 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:49 crc kubenswrapper[4744]: E1205 20:11:49.080645 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:49 crc kubenswrapper[4744]: E1205 20:11:49.080782 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the same node-status block repeats at roughly 100 ms intervals from 20:11:49.123185 through 20:11:49.945935 ...]
Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.048891 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.048963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.049011 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.049035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.049051 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.080760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:50 crc kubenswrapper[4744]: E1205 20:11:50.081049 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.094861 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.106125 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.120143 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-re
covery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.132883 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.145965 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.151513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.151556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.151568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.151585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.151598 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.157349 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.182187 4744 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:
10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.198722 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.212219 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.225988 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.239181 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.249164 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.253492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.253522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.253532 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.253568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.253578 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.258025 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.303341 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:45Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1205 20:11:45.545630 6745 factory.go:656] Stopping watch factory\\\\nI1205 20:11:45.545646 6745 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:45.545654 6745 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:45.545543 6745 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.545679 6745 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:45.545687 6745 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:45.545693 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:45.545878 6745 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.546194 6745 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:11:45.546368 6745 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.316103 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.327337 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.339417 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.350825 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.356388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.356425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.356435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.356448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.356461 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.363091 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:50Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.458454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.458497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.458506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.458520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.458531 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.561711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.561771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.561788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.561813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.561831 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.663941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.664007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.664030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.664057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.664073 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.767461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.767519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.767537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.767561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.767578 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.870394 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.870442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.870458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.870479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.870495 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.973261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.973393 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.973423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.973452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:50 crc kubenswrapper[4744]: I1205 20:11:50.973474 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:50Z","lastTransitionTime":"2025-12-05T20:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.076746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.076864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.076923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.076951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.076968 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.080668 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.080710 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:51 crc kubenswrapper[4744]: E1205 20:11:51.080798 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.080945 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:51 crc kubenswrapper[4744]: E1205 20:11:51.080953 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:51 crc kubenswrapper[4744]: E1205 20:11:51.081005 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.180653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.180722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.180746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.180784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.180824 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.283432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.283476 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.283486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.283499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.283510 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.385865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.386119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.386138 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.386161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.386178 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.489238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.489321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.489347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.489396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.489418 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.592420 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.592463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.592475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.592491 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.592503 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.694711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.694775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.694796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.694823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.694844 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.797709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.797750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.797762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.797779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.797791 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.900189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.900261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.900288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.900358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:51 crc kubenswrapper[4744]: I1205 20:11:51.900384 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:51Z","lastTransitionTime":"2025-12-05T20:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.002373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.002433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.002451 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.002475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.002493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.080216 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.080408 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.104769 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.104851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.104866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.104880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.104922 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.208184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.208233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.208247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.208265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.208280 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.311048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.311088 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.311102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.311118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.311129 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.413975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.414016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.414028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.414044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.414056 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.517073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.517125 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.517139 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.517156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.517168 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.586820 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.587000 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.587060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.587209 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.587221 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.587353 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.587326855 +0000 UTC m=+146.817138263 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.587495 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.587363646 +0000 UTC m=+146.817175024 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.587624 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.587576431 +0000 UTC m=+146.817387809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.619973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.620033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.620050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.620074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.620092 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
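
The "No retries permitted until ... (durationBeforeRetry 1m4s)" entries above come from the kubelet's per-operation exponential backoff: each consecutive failure of the same volume mount/unmount doubles the delay before the next attempt. A minimal sketch of that schedule follows, assuming the defaults in the upstream Kubernetes goroutinemap/exponentialbackoff package (initial delay 500 ms, factor 2, ceiling 2m2s) — an assumption drawn from upstream sources, not stated in this log. Under those parameters, 0.5 s x 2^7 = 64 s, so a 1m4s delay corresponds to the eighth consecutive failure of the operation.

    # Sketch of kubelet-style exponential backoff for volume operations.
    # Assumed parameters (not from this log): initial 0.5 s, doubling per
    # failure, capped at 122 s (2m2s).
    INITIAL_S = 0.5
    CAP_S = 122.0

    def duration_before_retry(failures: int) -> float:
        """Delay imposed after `failures` consecutive errors on one operation."""
        return min(INITIAL_S * 2 ** (failures - 1), CAP_S)

    for n in range(1, 10):
        print(f"failure {n}: retry in {duration_before_retry(n):g} s")
    # failure 8 prints 64 s, i.e. the 1m4s durationBeforeRetry seen above.
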
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.688175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.688249 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.688818 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.688883 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.688911 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.688831 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.689072 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.689007078 +0000 UTC m=+146.918818486 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.689090 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.689186 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:52 crc kubenswrapper[4744]: E1205 20:11:52.689351 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.689276375 +0000 UTC m=+146.919087783 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.722275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.722351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.722363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.722380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.722415 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.824823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.824880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.824897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.824919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.824936 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.928318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.928365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.928376 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.928392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:52 crc kubenswrapper[4744]: I1205 20:11:52.928403 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:52Z","lastTransitionTime":"2025-12-05T20:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.031810 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.031861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.031877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.031899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.031915 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.080642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.080642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.080839 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.080931 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.080677 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.081037 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.134986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.135057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.135080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.135108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.135127 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
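
The "No sandbox for pod can be found" and "Error syncing pod, skipping" records above describe the same condition from the pod workers' side: until a CNI network config appears, sandbox creation for every non-host-network pod is skipped on each sync attempt. A minimal sketch of the readiness test implied by the message follows; the directory comes from the log text itself, while the file extensions are the usual libcni conventions — an assumption, not something this cluster's configuration states.

    # Check whether a CNI network config is present, mirroring the
    # "no CNI configuration file in /etc/kubernetes/cni/net.d/" message.
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # directory named in the log

    def network_ready() -> bool:
        """True once at least one candidate CNI config file exists."""
        patterns = ("*.conf", "*.conflist", "*.json")  # common libcni extensions
        return CNI_CONF_DIR.is_dir() and any(
            True for p in patterns for _ in CNI_CONF_DIR.glob(p)
        )

    print("NetworkReady:", network_ready())
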
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.237064 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.237122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.237140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.237165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.237181 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.339798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.339846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.339864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.339884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.339897 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.442423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.442476 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.442492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.442513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.442531 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.545994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.546059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.546073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.546094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.546109 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.649157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.649189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.649200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.649215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.649226 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.751981 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.752029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.752070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.752093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.752108 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.759439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.759514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.759526 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.759538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.759548 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.776829 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.780760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.780833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.780859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.780890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.780910 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.797390 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.802370 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.802428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.802452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.802481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.802507 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.817521 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.822537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.822577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.822590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.822607 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.822621 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.838095 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.842051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.842088 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.842099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.842114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.842126 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.856227 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:11:53Z is after 2025-08-24T17:21:41Z" Dec 05 20:11:53 crc kubenswrapper[4744]: E1205 20:11:53.856496 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.858069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.858115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.858134 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.858157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.858174 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.961052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.961088 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.961099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.961115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:53 crc kubenswrapper[4744]: I1205 20:11:53.961128 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:53Z","lastTransitionTime":"2025-12-05T20:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.064414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.064749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.064767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.064784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.064827 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.080628 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:54 crc kubenswrapper[4744]: E1205 20:11:54.080819 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.168359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.168429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.168440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.168456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.168469 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.271045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.271095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.271109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.271130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.271148 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.373557 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.373592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.373602 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.373618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.373629 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.476112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.476148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.476161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.476176 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.476198 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.578777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.578823 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.578877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.578901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.578916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.682526 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.682588 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.682608 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.682630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.682648 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.785716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.785775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.785787 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.785807 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.785820 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.888838 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.888900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.888922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.888987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.889011 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.990986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.991028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.991040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.991057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:54 crc kubenswrapper[4744]: I1205 20:11:54.991070 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:54Z","lastTransitionTime":"2025-12-05T20:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.080560 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.080633 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:55 crc kubenswrapper[4744]: E1205 20:11:55.080680 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:55 crc kubenswrapper[4744]: E1205 20:11:55.080773 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.080839 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:55 crc kubenswrapper[4744]: E1205 20:11:55.080893 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.094347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.094425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.094444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.094461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.094473 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.196722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.196795 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.196812 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.196841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.196867 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.299615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.299662 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.299675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.299690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.299704 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.402829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.402889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.402904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.402927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.402942 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.506445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.506500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.506516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.506539 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.506557 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.609514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.609581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.609592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.609607 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.609618 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.712451 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.712527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.712549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.712580 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.712604 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.815680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.815749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.815773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.815802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.815823 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.919560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.919628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.919648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.919680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:55 crc kubenswrapper[4744]: I1205 20:11:55.919705 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:55Z","lastTransitionTime":"2025-12-05T20:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.022850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.022937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.022961 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.022989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.023011 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.080652 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:56 crc kubenswrapper[4744]: E1205 20:11:56.081057 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.125344 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.125449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.125469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.125527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.125550 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.228669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.228729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.228746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.228768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.228785 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.331633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.331715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.331754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.331785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.331807 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.434642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.434703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.434718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.434740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.434760 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.537211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.537288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.537335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.537360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.537377 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.639901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.639941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.639952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.639966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.639977 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.742855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.742886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.742894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.742908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.742916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.845010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.845052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.845060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.845074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.845083 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.948568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.948651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.948674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.948704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:56 crc kubenswrapper[4744]: I1205 20:11:56.948726 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:56Z","lastTransitionTime":"2025-12-05T20:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.051191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.051241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.051255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.051273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.051285 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.079866 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.079895 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.079915 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:57 crc kubenswrapper[4744]: E1205 20:11:57.080028 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:57 crc kubenswrapper[4744]: E1205 20:11:57.080137 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:57 crc kubenswrapper[4744]: E1205 20:11:57.080243 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.153107 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.153142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.153152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.153164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.153173 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.255550 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.255618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.255640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.255669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.255692 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.359164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.359215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.359230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.359247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.359259 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.461704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.461768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.461786 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.461827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.461844 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.565457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.565514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.565535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.565556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.565576 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.668949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.669028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.669046 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.669068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.669084 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.772418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.772476 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.772494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.772519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.772541 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.875943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.875995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.876007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.876023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.876036 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.979590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.979670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.979692 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.979724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:57 crc kubenswrapper[4744]: I1205 20:11:57.979748 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:57Z","lastTransitionTime":"2025-12-05T20:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.079883 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:11:58 crc kubenswrapper[4744]: E1205 20:11:58.080734 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.083630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.083675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.083689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.083708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.083721 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.186923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.186990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.187002 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.187024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.187038 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.290178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.290247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.290265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.290320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.290339 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.394008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.394093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.394122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.394147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.394171 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.498150 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.498232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.498255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.498286 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.498361 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.602069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.602134 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.602156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.602187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.602208 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.705071 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.705129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.705145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.705169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.705187 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.808343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.808411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.808430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.808453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.808472 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.911780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.911857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.911881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.911912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:58 crc kubenswrapper[4744]: I1205 20:11:58.911930 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:58Z","lastTransitionTime":"2025-12-05T20:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.015211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.015287 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.015347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.015375 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.015397 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.079648 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.079688 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.079763 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:11:59 crc kubenswrapper[4744]: E1205 20:11:59.079837 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:11:59 crc kubenswrapper[4744]: E1205 20:11:59.079971 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:11:59 crc kubenswrapper[4744]: E1205 20:11:59.080120 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.118765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.118815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.118835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.118860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.118877 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.221794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.221852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.221869 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.221892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.221908 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.325546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.325663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.325681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.325706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.325724 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.428867 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.428938 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.428957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.428980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.428995 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.532707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.532791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.532816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.532853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.532880 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.636232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.636339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.636363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.636387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.636407 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.739896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.739952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.739968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.739990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.740007 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.843399 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.843469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.843486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.843510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.843527 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.946175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.946258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.946282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.946351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:11:59 crc kubenswrapper[4744]: I1205 20:11:59.946376 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:11:59Z","lastTransitionTime":"2025-12-05T20:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.049990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.050099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.050118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.050186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.050210 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.079862 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:00 crc kubenswrapper[4744]: E1205 20:12:00.080279 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.081456 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:12:00 crc kubenswrapper[4744]: E1205 20:12:00.081744 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.101842 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.120383 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.140466 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.153451 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.153552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.153573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.153639 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.153659 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.159479 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.197980 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:45Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1205 20:11:45.545630 6745 factory.go:656] Stopping watch factory\\\\nI1205 20:11:45.545646 6745 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:45.545654 6745 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:45.545543 6745 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.545679 6745 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:45.545687 6745 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:45.545693 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:45.545878 6745 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.546194 6745 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:11:45.546368 6745 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.215152 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 
20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.229966 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.250173 4744 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\
\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.256673 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.256745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.256765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.256791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.256813 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.271375 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.292768 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.317642 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.350785 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.361813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.361875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.361895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.361919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.361937 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.374055 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.393602 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.415084 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.433057 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.455121 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.465469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.465521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.465534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.465554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.465569 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.471390 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.487074 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:00Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.569205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.569286 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.569342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.569371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.569395 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.671877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.671920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.671931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.671946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.671956 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.774178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.774216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.774227 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.774242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.774254 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.877282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.877349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.877368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.877391 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.877401 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.979790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.979864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.979889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.979916 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:00 crc kubenswrapper[4744]: I1205 20:12:00.979938 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:00Z","lastTransitionTime":"2025-12-05T20:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.080357 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.080506 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.080898 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:01 crc kubenswrapper[4744]: E1205 20:12:01.081081 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:01 crc kubenswrapper[4744]: E1205 20:12:01.081213 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:01 crc kubenswrapper[4744]: E1205 20:12:01.081571 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.082849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.082896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.082919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.082948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.082970 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.185921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.185993 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.186015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.186033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.186045 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.289109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.289166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.289184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.289207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.289227 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.391833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.391879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.391889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.391905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.391916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.494426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.494480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.494495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.494513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.494527 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.597454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.597511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.597529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.597553 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.597572 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.700750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.700811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.700828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.700851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.700869 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.804141 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.804216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.804235 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.804259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.804274 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.907670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.907721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.907737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.907762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:01 crc kubenswrapper[4744]: I1205 20:12:01.907780 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:01Z","lastTransitionTime":"2025-12-05T20:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.010988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.011059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.011078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.011104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.011126 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:02Z","lastTransitionTime":"2025-12-05T20:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.079994 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:02 crc kubenswrapper[4744]: E1205 20:12:02.080195 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.114463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.114519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.114539 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.114562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.114578 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:02Z","lastTransitionTime":"2025-12-05T20:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.217584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.217654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.217674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.217700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:02 crc kubenswrapper[4744]: I1205 20:12:02.217718 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:02Z","lastTransitionTime":"2025-12-05T20:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 20:12:03 crc kubenswrapper[4744]: I1205 20:12:03.080663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:03 crc kubenswrapper[4744]: I1205 20:12:03.080696 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 20:12:03 crc kubenswrapper[4744]: I1205 20:12:03.080666 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:03 crc kubenswrapper[4744]: E1205 20:12:03.080840 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 20:12:03 crc kubenswrapper[4744]: E1205 20:12:03.080939 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 20:12:03 crc kubenswrapper[4744]: E1205 20:12:03.081173 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
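[Editor's note: the node-status patch attempts recorded below all fail inside the node.network-node-identity.openshift.io webhook: the endpoint at https://127.0.0.1:9743 serves a certificate with NotAfter 2025-08-24T17:21:41Z, which the kubelet's clock (2025-12-05T20:12:04Z) correctly rejects as expired. A minimal Python sketch reproducing that expiry comparison from the error text; the OpenSSL-style NotAfter notation is an assumption:]

    import ssl
    from datetime import datetime, timezone

    # Values taken from the webhook failure recorded below.
    NOT_AFTER = "Aug 24 17:21:41 2025 GMT"  # certificate NotAfter (assumed notation)
    NOW = datetime(2025, 12, 5, 20, 12, 4, tzinfo=timezone.utc)  # client clock

    expiry = datetime.fromtimestamp(ssl.cert_time_to_seconds(NOT_AFTER), timezone.utc)
    if NOW > expiry:
        # This is the condition a TLS client reports as
        # "x509: certificate has expired or is not yet valid".
        print(f"expired: current time {NOW:%Y-%m-%dT%H:%M:%SZ} "
              f"is after {expiry:%Y-%m-%dT%H:%M:%SZ}")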
Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.080423 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.080605 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.112036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.112120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.112131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.112149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.112163 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.133693 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.139193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.139261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.139288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.139381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.139402 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.160044 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.165698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.165746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.165763 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.165789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.165808 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.187374 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.192732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.192780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.192796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.192822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.192839 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.212574 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.217814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.217911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.217931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.217954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.217972 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.237897 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T20:12:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19d8c788-01c0-4af7-b075-d7b6a1f1aadc\\\",\\\"systemUUID\\\":\\\"81235d50-4058-490a-b9b8-3ea7ecb9321c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:04Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:04 crc kubenswrapper[4744]: E1205 20:12:04.238138 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.240228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
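The three failed status patches above all terminate in the same TLS error: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05T20:12:04Z, so the kubelet gives up once the update exceeds its retry count. As a minimal sketch (in Go, matching the kubelet's own ecosystem), the following probe dials the endpoint taken from the error above and prints the validity window of whatever certificate it actually serves; if the printed notAfter matches the error, the certificate itself, not clock skew, is the problem. The program is illustrative only and not part of any OpenShift tooling.

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // Webhook endpoint as reported in the kubelet error above.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // read the certificate without trusting it
        })
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore, cert.NotAfter)
        }
    }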
event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.240323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.240343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.240368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.240385 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.343249 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.343323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.343340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.343362 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.343379 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.445628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.445699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.445722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.445750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.445768 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.548548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.548617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.548639 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.548667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.548689 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.651718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.651784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.651801 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.651822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.651839 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.754682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.754745 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.754784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.754811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.754830 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.859151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.859212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.859229 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.859251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.859270 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.962284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.962384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.962402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.962425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:04 crc kubenswrapper[4744]: I1205 20:12:04.962441 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:04Z","lastTransitionTime":"2025-12-05T20:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.065954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.066020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.066037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.066060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.066078 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.080360 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.080407 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.080381 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:05 crc kubenswrapper[4744]: E1205 20:12:05.080656 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:05 crc kubenswrapper[4744]: E1205 20:12:05.080784 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:05 crc kubenswrapper[4744]: E1205 20:12:05.081005 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.168622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.168683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.168700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.168725 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.168742 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.272241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.272342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.272368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.272395 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.272414 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.375338 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.375379 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.375388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.375405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.375417 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.477819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.477888 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.477923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.477955 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.477998 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.581760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.581818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.581835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.581857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.581875 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.683852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.683895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.683908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.683925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.683934 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.786168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.786246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.786267 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.786337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.786363 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.889252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.889346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.889366 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.889419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.889440 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.992966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.993028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.993052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.993078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:05 crc kubenswrapper[4744]: I1205 20:12:05.993097 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:05Z","lastTransitionTime":"2025-12-05T20:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.080578 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:06 crc kubenswrapper[4744]: E1205 20:12:06.081121 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
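Every NodeNotReady heartbeat in this stretch carries the same root message: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the container runtime reports NetworkReady=false and pods that need a network sandbox (network-check-target, network-check-source, networking-console-plugin, and network-metrics-daemon above) cannot be synced. A quick illustrative check, again in Go, for whether that directory holds any CNI config; the path comes from the messages above, while the accepted file extensions are an assumption based on common CNI conventions:

    package main

    import (
        "fmt"
        "log"
        "os"
        "path/filepath"
    )

    func main() {
        const dir = "/etc/kubernetes/cni/net.d" // path from the kubelet messages above
        entries, err := os.ReadDir(dir)
        if err != nil {
            log.Fatalf("cannot read %s: %v", dir, err)
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // conventional CNI config extensions (assumed)
                fmt.Println("CNI config present:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration found; NetworkReady=false above is the expected symptom")
        }
    }

An empty directory here usually means the network plugin (OVN-Kubernetes on CRC) has not written its configuration yet, which is consistent with the expired-certificate failures noted earlier.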
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.095651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.095674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.095683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.095697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.095708 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.198241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.198340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.198361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.198390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.198409 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.302033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.302119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.302148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.302179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.302205 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.406013 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.406112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.406132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.406161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.406183 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.509408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.509466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.509485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.509509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.509527 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.612940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.613000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.613021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.613047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.613063 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.716852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.716904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.716928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.716953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.716971 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.820432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.820483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.820500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.820522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.820542 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.924061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.924495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.924657 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.924797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:06 crc kubenswrapper[4744]: I1205 20:12:06.924943 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:06Z","lastTransitionTime":"2025-12-05T20:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.027895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.027972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.027990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.028016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.028033 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.079616 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:07 crc kubenswrapper[4744]: E1205 20:12:07.079788 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.079896 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.079639 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:07 crc kubenswrapper[4744]: E1205 20:12:07.080082 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:07 crc kubenswrapper[4744]: E1205 20:12:07.080129 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.130991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.131049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.131068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.131092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.131111 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.234154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.234191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.234203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.234221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.234233 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.340551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.340593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.340602 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.340617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.340627 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.443728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.443764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.443771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.443785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.443796 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.546686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.546734 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.546752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.546777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.546793 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.650172 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.650536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.650757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.650905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.651056 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.753459 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.753513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.753531 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.753553 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.753569 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.857102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.857359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.857397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.857427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.857448 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.960342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.960413 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.960440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.960470 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:07 crc kubenswrapper[4744]: I1205 20:12:07.960490 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:07Z","lastTransitionTime":"2025-12-05T20:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.062865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.063158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.063251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.063389 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.063501 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.079978 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:08 crc kubenswrapper[4744]: E1205 20:12:08.080289 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.165896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.165966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.165990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.166021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.166045 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.268433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.268482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.268494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.268511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.268525 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.371691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.371776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.371802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.371837 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.371866 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.475686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.475748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.475765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.475788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.475809 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.579026 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.579091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.579110 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.579137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.579156 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.682741 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.682825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.682850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.682880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.682907 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.785730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.785803 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.785822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.785856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.785891 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.864128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:08 crc kubenswrapper[4744]: E1205 20:12:08.864429 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:08 crc kubenswrapper[4744]: E1205 20:12:08.864537 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs podName:9d0c84c8-b581-47ce-8cb8-956d3ef79238 nodeName:}" failed. No retries permitted until 2025-12-05 20:13:12.864509396 +0000 UTC m=+163.094320804 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs") pod "network-metrics-daemon-cgjbb" (UID: "9d0c84c8-b581-47ce-8cb8-956d3ef79238") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.888793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.888850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.888867 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.888890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.888907 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.992121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.992197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.992225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.992253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:08 crc kubenswrapper[4744]: I1205 20:12:08.992275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:08Z","lastTransitionTime":"2025-12-05T20:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.079836 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.079878 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.079972 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:09 crc kubenswrapper[4744]: E1205 20:12:09.080180 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:09 crc kubenswrapper[4744]: E1205 20:12:09.080419 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:09 crc kubenswrapper[4744]: E1205 20:12:09.080794 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.095136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.095187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.095205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.095256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.095286 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.198560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.198614 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.198630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.198657 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.198679 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.302492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.302556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.302574 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.302597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.302617 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.406219 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.406277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.406321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.406348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.406368 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.509084 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.509142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.509162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.509188 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.509205 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.612662 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.612730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.612753 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.612852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.612871 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.715262 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.715327 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.715340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.715355 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.715363 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.818200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.818234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.818242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.818255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.818264 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.921212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.921270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.921330 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.921362 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:09 crc kubenswrapper[4744]: I1205 20:12:09.921382 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:09Z","lastTransitionTime":"2025-12-05T20:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.023467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.023535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.023554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.023578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.023595 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.081401 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:10 crc kubenswrapper[4744]: E1205 20:12:10.081537 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.104332 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jrcln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd4e5b0-9a0c-4819-9f3b-e13521e44b41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2c4cfc44d8c561fd80794464c5e89f6c11ae9cd6c4e41db3f20e008a55a718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e767ee28d938c9ff24270885d0adbe9890e504391d735c92bcc30aea8bfe9e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50567653c11bd794078e7bc45ce52db47d04d88d952c1f22472575c333b437f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec58daa4f9a32e4af04b9d895ae42c11b18b0078f6c13e2e58fe845fd32e294\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be30043abdbd08b9517e8d0f6bc5dfaa5ce6792128bd283ea26b1c27cf7d2526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9f34f235bf0a16fa763b949a4abb1844ae8f3fdab90a6345aa9721e51d05040\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7eaeb1792943ee8c00fa24623efa79a197941813ac16e6fd729f1a5df995fa40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8r2zx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jrcln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.122531 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a0d066c-089c-42c0-9ae0-480eb0ad2449\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2202785bde164d4a280e7869c6fcea433591d861b9ebceeb441f42e7a44552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae00d1d4dfc9390ee465cc444d3bfc55318ed5b8a4c27c8ed05cc2be77e6d0a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.127377 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.127419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.127431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.127459 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.127475 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.143325 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5846a3-5e6a-41aa-9760-c3cbd1ae2435\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cb0c6029e9d18a57a79c23494dfa0c9f0edb458067341b7edd7f172d15f49f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a563f5bbd35e353c4f1763fdc0d084cd4bc94f57fb048205dd02dcadbac4e74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://712fb551ba5ca5e933dcb56b5d5d89d892320c9e52da2d46da7e19133939ef79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60cf98fec710687b48e20894dd8b3d487a3f4accfef1ff66c8aa4918c5d47440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.165148 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.185376 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7qlm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89bdeba9-f644-4465-a9f8-82c682f6aea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:37Z\\\",\\\"message\\\":\\\"2025-12-05T20:10:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a\\\\n2025-12-05T20:10:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_87b863ce-1c03-488b-ad9b-135045a3589a to /host/opt/cni/bin/\\\\n2025-12-05T20:10:52Z [verbose] multus-daemon started\\\\n2025-12-05T20:10:52Z [verbose] Readiness Indicator file check\\\\n2025-12-05T20:11:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x692c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7qlm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.201702 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d0c84c8-b581-47ce-8cb8-956d3ef79238\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-csrlv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cgjbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.225767 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89cf5d8f-6aa0-46b8-b362-ff9d20d24af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77efd2bfbb8e1a9e913409410058dca6ce28ecb43be088996e1f7a3d36fdf714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a6476186762ee9cf236720ee91111937afea9507e0ce4431d605610e237f9c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9f792a3bfef074cca78e264bd406b32dc77417e7659afb61a65d8c9e93c279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73de011ae8a7d56434cbcfe978a08d96324e8e9
cac392215b009e00bc0cadd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bd6cc6f1fc04807112d8f8a1700328384223850a80316d27e0f4fff7076b08a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0cd673ccd7aadef72ed4ff5aba22cbb0311701035e9694ef1abeee1654f2111\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ab5d1df4eb68b1b647878a4d76a1e3276c695fd8d4d5375e85d01adf311322\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://983d3d3b95996ea1a6f95d4d6af6f9b06e592dc9fca67952b409ac612e2c3798\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.230144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.230167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.230206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.230221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.230230 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.243994 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f44d0ab9-2456-45d6-bb68-fbc933c751a1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 20:10:42.519865 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 20:10:42.522607 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1093749926/tls.crt::/tmp/serving-cert-1093749926/tls.key\\\\\\\"\\\\nI1205 20:10:48.603371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 20:10:48.606810 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 20:10:48.606825 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 20:10:48.606855 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 20:10:48.606880 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 20:10:48.611777 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1205 20:10:48.611786 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1205 20:10:48.611822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 20:10:48.611838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 20:10:48.611845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 20:10:48.611851 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 20:10:48.611857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1205 20:10:48.615588 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.262417 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.281447 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41f5fa4584b5eb7490eb18ac12421d55764219330aa09ff070286c90b6e5c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b749a564f3c95c5790487d633e7912bcd3f136fc61b43519b9a6316c61da0a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.301568 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"582ca93a-2c1d-43e3-ba9b-4e4b80d723ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce09de6307eda8a6d9028890cdde887fc1e7788fbf532fc1aa42cb29a5655144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4dc73b152ef74e14668659413c71f1857e254a0c619efbf072b90896d08cff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a923e2e42c56836088fad2bbeb0238e69dc97610f52a511218476cc41d3ad703\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.318747 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jsdsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5969bfd5-aba0-4d9f-9b90-16de741c404a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f413a5b69be7baf19503ade03ac5c3eae234041fac775e869df0e5f2b85ba2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff2h6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jsdsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.333543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dddz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df8d9ec8-e8fd-4d2f-bd06-0d082a38e4ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b63a64df80fb02704f1ad4c8910c279558ec4838aad96490d99aa1d550a2309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zlf5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dddz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.334083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.334136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 
crc kubenswrapper[4744]: I1205 20:12:10.334154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.334181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.334199 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.365894 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243
375b35940fc12a4735d8b5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T20:11:45Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1205 20:11:45.545630 6745 factory.go:656] Stopping watch factory\\\\nI1205 20:11:45.545646 6745 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 20:11:45.545654 6745 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 20:11:45.545543 6745 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.545679 6745 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 20:11:45.545687 6745 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 20:11:45.545693 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 20:11:45.545878 6745 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 20:11:45.546194 6745 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 20:11:45.546368 6745 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T20:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97hdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bk4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.383168 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9867a450-a95a-41ea-9d64-21f01814ed73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420548750ed3970bfbd6d5d2120fa9809cc4af22453f65c54740f621216cf2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8bd5f24842bc78463e0c9da4eb20dc198324d169617768ea724a6a8c114d8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:11:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wltcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:11:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2rtm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.404007 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cc2303528f895b9a2c39d8ccc299d6e2ff0702535fc34780987323c45ca8211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.423379 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285b8b09def6a8afc56e3b6c95165b5bef58262f9bad19cff90a0822f5dcae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.437766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.437832 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.437849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.437876 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.437898 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.447217 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.470741 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e25986a8-4343-4c98-bc53-6c1b077661f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T20:10:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d37823028ad82933ff04e0bc9671617175e12cb8167a8522d0c029e766341a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T20:10:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9xzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T20:10:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkhvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T20:12:10Z is after 2025-08-24T17:21:41Z" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.546419 4744 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.546482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.546499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.546524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.546542 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.650137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.650534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.650663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.650926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.651062 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.755159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.755212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.755224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.755250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.755264 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.858994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.859060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.859070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.859094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.859109 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.962737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.962801 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.962818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.962843 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:10 crc kubenswrapper[4744]: I1205 20:12:10.962860 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:10Z","lastTransitionTime":"2025-12-05T20:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.066846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.066887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.066897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.066912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.066923 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.080693 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.080784 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.080822 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:11 crc kubenswrapper[4744]: E1205 20:12:11.081020 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:11 crc kubenswrapper[4744]: E1205 20:12:11.081200 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:11 crc kubenswrapper[4744]: E1205 20:12:11.081356 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.169839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.169933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.169962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.169994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.170018 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.272524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.272566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.272577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.272596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.272608 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.375708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.375772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.375789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.375813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.375831 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.479243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.479356 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.479381 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.479461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.479491 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.582467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.582837 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.583023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.583257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.583499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.686854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.686925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.686947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.686975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.686998 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.789865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.789931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.789949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.789973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.789994 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.893069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.893106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.893117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.893132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.893145 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.996250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.996355 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.996380 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.996411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:11 crc kubenswrapper[4744]: I1205 20:12:11.996438 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:11Z","lastTransitionTime":"2025-12-05T20:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.080064 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:12 crc kubenswrapper[4744]: E1205 20:12:12.080256 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
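Every NodeNotReady heartbeat above carries the same root cause string: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal diagnostic sketch (not part of this log, intended to be run on the node) that lists the config files a CNI loader would pick up from that directory, assuming the standard .conf/.conflist/.json extensions that libcni recognizes:

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the NetworkPluginNotReady message above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni accepts
			fmt.Println(filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// An empty result matches the kubelet's NetworkReady=false condition:
		// ovnkube-controller (crash-looping above) has not written its config yet.
		fmt.Println("no CNI configuration files found")
	}
}
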
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.099340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.099411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.099435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.099463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.099486 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.202199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.202275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.202342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.202374 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.202400 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.305422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.305488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.305505 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.305529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.305547 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.408851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.408903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.408920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.408943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.408960 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.511473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.511529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.511546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.511568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.511584 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.613686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.613741 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.613758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.613783 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.613799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.716759 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.716818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.716834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.716857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.716874 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.820154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.820200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.820217 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.820240 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.820257 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.923140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.923230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.923249 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.923270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:12 crc kubenswrapper[4744]: I1205 20:12:12.923287 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:12Z","lastTransitionTime":"2025-12-05T20:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.025402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.025467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.025484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.025511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.025536 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.080217 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.080340 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.080587 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:13 crc kubenswrapper[4744]: E1205 20:12:13.080733 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:13 crc kubenswrapper[4744]: E1205 20:12:13.080834 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:13 crc kubenswrapper[4744]: E1205 20:12:13.080988 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.128713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.128773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.128791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.128815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.128836 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.232535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.232881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.232923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.232949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.233002 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.336115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.336187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.336204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.336228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.336246 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.439056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.439123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.439139 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.439164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.439182 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.542080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.542142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.542161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.542184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.542202 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.645162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.645211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.645227 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.645249 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.645264 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.748091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.748159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.748182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.748211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.748232 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.851399 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.851464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.851481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.851503 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.851523 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.954440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.954501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.954519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.954542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:13 crc kubenswrapper[4744]: I1205 20:12:13.954561 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:13Z","lastTransitionTime":"2025-12-05T20:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.058387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.058485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.058506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.058529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.058548 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.080576 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:14 crc kubenswrapper[4744]: E1205 20:12:14.080755 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.081790 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:12:14 crc kubenswrapper[4744]: E1205 20:12:14.082050 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.161751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.161811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.161829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.161853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.161871 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.264851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.264892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.264902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.264919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.264931 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.367716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.367782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.367798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.367828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.367848 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.470527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.470591 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.470609 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.470633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.470650 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.507867 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.507939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.507964 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.507996 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.508019 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T20:12:14Z","lastTransitionTime":"2025-12-05T20:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.575219 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh"] Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.575832 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.578572 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.578713 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.579136 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.582065 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.626682 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.626715 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.626751 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.626771 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.626786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.662692 4744 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.662671883 podStartE2EDuration="1m26.662671883s" podCreationTimestamp="2025-12-05 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.662505489 +0000 UTC m=+104.892316867" watchObservedRunningTime="2025-12-05 20:12:14.662671883 +0000 UTC m=+104.892483261" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.696844 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.696820948 podStartE2EDuration="1m26.696820948s" podCreationTimestamp="2025-12-05 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.693651797 +0000 UTC m=+104.923463185" watchObservedRunningTime="2025-12-05 20:12:14.696820948 +0000 UTC m=+104.926632326" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.706136 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9dddz" podStartSLOduration=84.706111003 podStartE2EDuration="1m24.706111003s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.705645571 +0000 UTC m=+104.935456959" watchObservedRunningTime="2025-12-05 20:12:14.706111003 +0000 UTC m=+104.935922381" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728376 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728515 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728550 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728585 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728660 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.728692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.729629 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.734791 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.741131 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.741106198 podStartE2EDuration="1m19.741106198s" podCreationTimestamp="2025-12-05 20:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.724986521 +0000 UTC m=+104.954797919" watchObservedRunningTime="2025-12-05 20:12:14.741106198 +0000 UTC m=+104.970917596" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.741604 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jsdsn" podStartSLOduration=86.741596291 podStartE2EDuration="1m26.741596291s" podCreationTimestamp="2025-12-05 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.741092228 +0000 UTC m=+104.970903616" watchObservedRunningTime="2025-12-05 20:12:14.741596291 +0000 UTC m=+104.971407699" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.759908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5fa48e-d97a-4174-a48f-e51c6e5c6697-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x5bh\" (UID: \"ac5fa48e-d97a-4174-a48f-e51c6e5c6697\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc 
kubenswrapper[4744]: I1205 20:12:14.768785 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podStartSLOduration=84.768764069 podStartE2EDuration="1m24.768764069s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.768312617 +0000 UTC m=+104.998123995" watchObservedRunningTime="2025-12-05 20:12:14.768764069 +0000 UTC m=+104.998575437" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.805005 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2rtm" podStartSLOduration=83.804984165 podStartE2EDuration="1m23.804984165s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.804274207 +0000 UTC m=+105.034085595" watchObservedRunningTime="2025-12-05 20:12:14.804984165 +0000 UTC m=+105.034795543" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.880470 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7qlm7" podStartSLOduration=84.880446006 podStartE2EDuration="1m24.880446006s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.880261841 +0000 UTC m=+105.110073219" watchObservedRunningTime="2025-12-05 20:12:14.880446006 +0000 UTC m=+105.110257394" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.894954 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.906736 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jrcln" podStartSLOduration=84.90671413 podStartE2EDuration="1m24.90671413s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.906400942 +0000 UTC m=+105.136212320" watchObservedRunningTime="2025-12-05 20:12:14.90671413 +0000 UTC m=+105.136525528" Dec 05 20:12:14 crc kubenswrapper[4744]: W1205 20:12:14.907845 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5fa48e_d97a_4174_a48f_e51c6e5c6697.slice/crio-eb5cc3bd92545b2f6f22f8f994d10d2ba371bba588ae4b22b9c9aa3d5414ae99 WatchSource:0}: Error finding container eb5cc3bd92545b2f6f22f8f994d10d2ba371bba588ae4b22b9c9aa3d5414ae99: Status 404 returned error can't find the container with id eb5cc3bd92545b2f6f22f8f994d10d2ba371bba588ae4b22b9c9aa3d5414ae99 Dec 05 20:12:14 crc kubenswrapper[4744]: I1205 20:12:14.920517 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.920495189 podStartE2EDuration="38.920495189s" podCreationTimestamp="2025-12-05 20:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.919188606 +0000 UTC m=+105.149000004" watchObservedRunningTime="2025-12-05 20:12:14.920495189 +0000 UTC m=+105.150306577" Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.080395 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.080470 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:15 crc kubenswrapper[4744]: E1205 20:12:15.080510 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:15 crc kubenswrapper[4744]: E1205 20:12:15.080642 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.080399 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:15 crc kubenswrapper[4744]: E1205 20:12:15.080761 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.766341 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" event={"ID":"ac5fa48e-d97a-4174-a48f-e51c6e5c6697","Type":"ContainerStarted","Data":"3b7e415066f5850e3decd3ee8fe281563c1df8520b71487618ac2f12ad49cb91"} Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.766417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" event={"ID":"ac5fa48e-d97a-4174-a48f-e51c6e5c6697","Type":"ContainerStarted","Data":"eb5cc3bd92545b2f6f22f8f994d10d2ba371bba588ae4b22b9c9aa3d5414ae99"} Dec 05 20:12:15 crc kubenswrapper[4744]: I1205 20:12:15.783667 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.783645076 podStartE2EDuration="51.783645076s" podCreationTimestamp="2025-12-05 20:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:14.938870554 +0000 UTC m=+105.168681922" watchObservedRunningTime="2025-12-05 20:12:15.783645076 +0000 UTC m=+106.013456484" Dec 05 20:12:16 crc kubenswrapper[4744]: I1205 20:12:16.080428 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:16 crc kubenswrapper[4744]: E1205 20:12:16.080641 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:17 crc kubenswrapper[4744]: I1205 20:12:17.080615 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:17 crc kubenswrapper[4744]: E1205 20:12:17.080851 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:17 crc kubenswrapper[4744]: I1205 20:12:17.081260 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:17 crc kubenswrapper[4744]: E1205 20:12:17.081451 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:17 crc kubenswrapper[4744]: I1205 20:12:17.081600 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:17 crc kubenswrapper[4744]: E1205 20:12:17.081779 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:18 crc kubenswrapper[4744]: I1205 20:12:18.080704 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:18 crc kubenswrapper[4744]: E1205 20:12:18.080949 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:19 crc kubenswrapper[4744]: I1205 20:12:19.079951 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:19 crc kubenswrapper[4744]: I1205 20:12:19.079985 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:19 crc kubenswrapper[4744]: E1205 20:12:19.080133 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:19 crc kubenswrapper[4744]: E1205 20:12:19.080268 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:19 crc kubenswrapper[4744]: I1205 20:12:19.079984 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:19 crc kubenswrapper[4744]: E1205 20:12:19.080582 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:20 crc kubenswrapper[4744]: I1205 20:12:20.080615 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:20 crc kubenswrapper[4744]: E1205 20:12:20.081865 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:21 crc kubenswrapper[4744]: I1205 20:12:21.080344 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:21 crc kubenswrapper[4744]: I1205 20:12:21.080532 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:21 crc kubenswrapper[4744]: I1205 20:12:21.080613 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:21 crc kubenswrapper[4744]: E1205 20:12:21.080889 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:21 crc kubenswrapper[4744]: E1205 20:12:21.080979 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:21 crc kubenswrapper[4744]: E1205 20:12:21.081149 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:22 crc kubenswrapper[4744]: I1205 20:12:22.080035 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:22 crc kubenswrapper[4744]: E1205 20:12:22.080262 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.079692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.079742 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.079920 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:23 crc kubenswrapper[4744]: E1205 20:12:23.079948 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:23 crc kubenswrapper[4744]: E1205 20:12:23.080069 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:23 crc kubenswrapper[4744]: E1205 20:12:23.080193 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.799007 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/1.log" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.799843 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/0.log" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.799921 4744 generic.go:334] "Generic (PLEG): container finished" podID="89bdeba9-f644-4465-a9f8-82c682f6aea3" containerID="6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c" exitCode=1 Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.799959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerDied","Data":"6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c"} Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.800001 4744 scope.go:117] "RemoveContainer" containerID="07a01dd20b61ab68edd005b45ae4f0974db05dc3b59dfc6ad4c592e8562e9547" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.800410 4744 scope.go:117] "RemoveContainer" containerID="6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c" Dec 05 20:12:23 crc kubenswrapper[4744]: E1205 20:12:23.800549 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7qlm7_openshift-multus(89bdeba9-f644-4465-a9f8-82c682f6aea3)\"" pod="openshift-multus/multus-7qlm7" podUID="89bdeba9-f644-4465-a9f8-82c682f6aea3" Dec 05 20:12:23 crc kubenswrapper[4744]: I1205 20:12:23.828984 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x5bh" podStartSLOduration=93.82895734 podStartE2EDuration="1m33.82895734s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:15.782443936 +0000 UTC m=+106.012255394" watchObservedRunningTime="2025-12-05 20:12:23.82895734 +0000 UTC m=+114.058768748" Dec 05 20:12:24 crc kubenswrapper[4744]: I1205 20:12:24.080436 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:24 crc kubenswrapper[4744]: E1205 20:12:24.080998 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:24 crc kubenswrapper[4744]: I1205 20:12:24.805637 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/1.log" Dec 05 20:12:25 crc kubenswrapper[4744]: I1205 20:12:25.079826 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:25 crc kubenswrapper[4744]: I1205 20:12:25.079930 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:25 crc kubenswrapper[4744]: I1205 20:12:25.079827 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:25 crc kubenswrapper[4744]: E1205 20:12:25.080065 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:25 crc kubenswrapper[4744]: E1205 20:12:25.080168 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:25 crc kubenswrapper[4744]: E1205 20:12:25.080842 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:25 crc kubenswrapper[4744]: I1205 20:12:25.081157 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:12:25 crc kubenswrapper[4744]: E1205 20:12:25.081469 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bk4n_openshift-ovn-kubernetes(99bea8e6-6eff-4db0-8e98-20a5ae64e0d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" Dec 05 20:12:26 crc kubenswrapper[4744]: I1205 20:12:26.080589 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:26 crc kubenswrapper[4744]: E1205 20:12:26.080810 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:27 crc kubenswrapper[4744]: I1205 20:12:27.079866 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:27 crc kubenswrapper[4744]: I1205 20:12:27.079906 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:27 crc kubenswrapper[4744]: I1205 20:12:27.079922 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:27 crc kubenswrapper[4744]: E1205 20:12:27.080030 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:27 crc kubenswrapper[4744]: E1205 20:12:27.080172 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:27 crc kubenswrapper[4744]: E1205 20:12:27.080499 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:28 crc kubenswrapper[4744]: I1205 20:12:28.080018 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:28 crc kubenswrapper[4744]: E1205 20:12:28.080546 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:29 crc kubenswrapper[4744]: I1205 20:12:29.080606 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:29 crc kubenswrapper[4744]: I1205 20:12:29.080650 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:29 crc kubenswrapper[4744]: I1205 20:12:29.080736 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:29 crc kubenswrapper[4744]: E1205 20:12:29.080800 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:29 crc kubenswrapper[4744]: E1205 20:12:29.080922 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:29 crc kubenswrapper[4744]: E1205 20:12:29.081104 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:30 crc kubenswrapper[4744]: E1205 20:12:30.080052 4744 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 20:12:30 crc kubenswrapper[4744]: I1205 20:12:30.080265 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:30 crc kubenswrapper[4744]: E1205 20:12:30.082255 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:30 crc kubenswrapper[4744]: E1205 20:12:30.166497 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:12:31 crc kubenswrapper[4744]: I1205 20:12:31.079880 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:31 crc kubenswrapper[4744]: I1205 20:12:31.079880 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:31 crc kubenswrapper[4744]: E1205 20:12:31.080057 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:31 crc kubenswrapper[4744]: E1205 20:12:31.080178 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:31 crc kubenswrapper[4744]: I1205 20:12:31.079904 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:31 crc kubenswrapper[4744]: E1205 20:12:31.080278 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:32 crc kubenswrapper[4744]: I1205 20:12:32.080107 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:32 crc kubenswrapper[4744]: E1205 20:12:32.080331 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:33 crc kubenswrapper[4744]: I1205 20:12:33.079591 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:33 crc kubenswrapper[4744]: I1205 20:12:33.079675 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:33 crc kubenswrapper[4744]: I1205 20:12:33.079593 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:33 crc kubenswrapper[4744]: E1205 20:12:33.079761 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:33 crc kubenswrapper[4744]: E1205 20:12:33.079915 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:33 crc kubenswrapper[4744]: E1205 20:12:33.080080 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:34 crc kubenswrapper[4744]: I1205 20:12:34.080093 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:34 crc kubenswrapper[4744]: E1205 20:12:34.080366 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:35 crc kubenswrapper[4744]: I1205 20:12:35.079954 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:35 crc kubenswrapper[4744]: I1205 20:12:35.079957 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:35 crc kubenswrapper[4744]: E1205 20:12:35.080124 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:35 crc kubenswrapper[4744]: E1205 20:12:35.080283 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:35 crc kubenswrapper[4744]: I1205 20:12:35.080104 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:35 crc kubenswrapper[4744]: E1205 20:12:35.080585 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:35 crc kubenswrapper[4744]: E1205 20:12:35.168157 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.079998 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:36 crc kubenswrapper[4744]: E1205 20:12:36.080507 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.080542 4744 scope.go:117] "RemoveContainer" containerID="6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.081664 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.851730 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/3.log" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.855187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerStarted","Data":"82823500d1248bb0c059dbb22c93d962b48fbe35255bd4337304866b2a19b887"} Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.856038 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.856860 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/1.log" Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.856894 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerStarted","Data":"158a06cf97c1029c61e484aea0506a8356678a2eb865af54482cad3a1605bc60"} Dec 05 20:12:36 crc kubenswrapper[4744]: I1205 20:12:36.895140 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podStartSLOduration=106.895123931 podStartE2EDuration="1m46.895123931s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:36.894849595 +0000 UTC m=+127.124660963" watchObservedRunningTime="2025-12-05 20:12:36.895123931 +0000 UTC m=+127.124935299" Dec 05 20:12:37 crc kubenswrapper[4744]: I1205 20:12:37.080046 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:37 crc kubenswrapper[4744]: I1205 20:12:37.080095 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:37 crc kubenswrapper[4744]: I1205 20:12:37.080198 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:37 crc kubenswrapper[4744]: E1205 20:12:37.080199 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:37 crc kubenswrapper[4744]: E1205 20:12:37.080349 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:37 crc kubenswrapper[4744]: E1205 20:12:37.080400 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:37 crc kubenswrapper[4744]: I1205 20:12:37.092861 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cgjbb"] Dec 05 20:12:37 crc kubenswrapper[4744]: I1205 20:12:37.093017 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:37 crc kubenswrapper[4744]: E1205 20:12:37.093174 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:39 crc kubenswrapper[4744]: I1205 20:12:39.080239 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:39 crc kubenswrapper[4744]: I1205 20:12:39.080285 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:39 crc kubenswrapper[4744]: I1205 20:12:39.080371 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:39 crc kubenswrapper[4744]: I1205 20:12:39.080384 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:39 crc kubenswrapper[4744]: E1205 20:12:39.082688 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 20:12:39 crc kubenswrapper[4744]: E1205 20:12:39.082474 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 20:12:39 crc kubenswrapper[4744]: E1205 20:12:39.082118 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 20:12:39 crc kubenswrapper[4744]: E1205 20:12:39.082767 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cgjbb" podUID="9d0c84c8-b581-47ce-8cb8-956d3ef79238" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.080159 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.080206 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.080277 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.080174 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.082910 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.082932 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.083727 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.084157 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.085992 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 20:12:41 crc kubenswrapper[4744]: I1205 20:12:41.086059 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.125812 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.181354 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.181908 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.182178 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.182757 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.184603 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5krf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.185553 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.185913 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.186212 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.186223 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.186708 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.189090 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.189379 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.194768 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.194795 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.195049 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.197097 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.197093 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.199219 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.207779 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.208352 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qv6mb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.208387 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.208668 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.208713 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.208762 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.214141 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.217648 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.217245 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.221149 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.221397 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.221706 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.234535 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.234694 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.234798 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.234946 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.235114 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.235235 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.235546 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.236052 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.236341 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htzxr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.236541 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.237281 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.237749 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.238022 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nxjnb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.238514 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.238513 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.239932 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245036 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245255 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245430 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245683 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245717 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gw9l6"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.245915 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.246203 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l6gl7"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.246630 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.246881 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gw9l6" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253470 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253701 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253754 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253797 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253754 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253943 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.253951 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254176 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254390 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254529 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254887 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254952 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.254998 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.255104 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.255198 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.255379 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.255890 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.256314 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.256459 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.256515 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257024 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257149 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257262 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257563 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257705 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257748 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257873 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257903 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.257965 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.258008 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.258185 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.258630 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.259392 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.259683 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.259947 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.260278 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.260437 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.261477 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.261671 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.261873 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.262003 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.262116 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.262153 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.262194 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.262932 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.263169 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.264315 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.264829 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.266456 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.287863 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.288773 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.289884 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dn5pv"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.289935 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.291733 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.292142 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.292227 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.302965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-client\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303270 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-machine-approver-tls\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-policies\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303520 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config\") pod 
\"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk56f\" (UniqueName: \"kubernetes.io/projected/c941b3ea-ef53-47c4-b10a-6e949b7098d2-kube-api-access-xk56f\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303797 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.303909 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-encryption-config\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304128 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-dir\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h44h4\" (UniqueName: \"kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304678 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sd82\" (UniqueName: \"kubernetes.io/projected/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-kube-api-access-4sd82\") pod \"machine-approver-56656f9798-9rmhw\" (UID: 
\"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-config\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304870 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lcs\" (UniqueName: \"kubernetes.io/projected/818b6964-1c62-4e2e-8079-a41f9bdcb763-kube-api-access-v4lcs\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304977 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/818b6964-1c62-4e2e-8079-a41f9bdcb763-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305085 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305180 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-images\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305271 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-serving-cert\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305401 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-auth-proxy-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305507 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304497 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304593 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304663 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304707 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304772 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.306811 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.307195 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.304845 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.308164 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.308415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305636 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.308893 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.309208 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.305662 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.310715 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.306713 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.310828 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.316953 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d6kbr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.317523 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.310794 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.318259 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.318507 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.318654 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.321448 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.321603 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.322987 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.323449 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.323758 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.324390 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.324900 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.324902 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.326135 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.326244 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.326476 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.326987 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.327073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.328462 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.328771 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.331404 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.331892 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.332200 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.332396 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.334885 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.335236 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2wxh"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.335735 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.335984 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.335587 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.336652 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.337078 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.337621 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.340188 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.342548 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.342711 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.343090 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.343374 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lk6bf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.345001 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.345844 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.346234 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.346734 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.352180 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.352514 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.353611 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.357069 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.358435 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.359610 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.361526 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9hjlq"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.362254 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.362940 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5krf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.363989 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.367428 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qv6mb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.368432 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.376624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.377944 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htzxr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.379132 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.380242 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.381266 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.382505 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nxjnb"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.383624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.383773 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.384764 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l6gl7"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.386345 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.387663 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.388858 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6f79s"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.389545 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.389945 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.391846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.392832 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.394145 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gw9l6"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.394801 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.395721 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.397851 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.398854 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.399875 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.401841 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.401939 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.403277 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.404540 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.404938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406468 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-node-pullsecrets\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406502 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-client\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406527 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv86m\" (UniqueName: \"kubernetes.io/projected/79d58c0b-affd-462b-b4ee-1134ede8bcb5-kube-api-access-zv86m\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-webhook-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406578 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-images\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-serving-cert\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406680 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-client\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406712 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/476c0833-0a8f-4824-a7fe-6f28aada483b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406738 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-serving-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406762 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-config\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-auth-proxy-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406811 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406838 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09993f0f-6381-4517-8246-ef1d188bea5c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406876 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-image-import-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406928 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5196711d-0b39-4630-a0bc-d210d210fc4b-tmpfs\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.406954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407330 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-client\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407364 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-encryption-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407386 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit-dir\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407498 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxx8\" (UniqueName: \"kubernetes.io/projected/5196711d-0b39-4630-a0bc-d210d210fc4b-kube-api-access-ndxx8\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407527 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfn84\" (UniqueName: \"kubernetes.io/projected/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-kube-api-access-jfn84\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a971a99c-926f-48f4-88d5-9033085cc89b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-machine-approver-tls\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407825 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-policies\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.407849 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09993f0f-6381-4517-8246-ef1d188bea5c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.407910 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:45.907888104 +0000 UTC m=+136.137699492 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.408017 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a971a99c-926f-48f4-88d5-9033085cc89b-config\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.408165 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.410796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.408682 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.409029 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.410745 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-policies\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-auth-proxy-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.408977 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dn5pv"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412247 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9747b-ba54-4fa6-8849-7447d6683c68-serving-cert\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412314 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj295\" (UniqueName: \"kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412369 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvqcr\" (UniqueName: \"kubernetes.io/projected/2dd0664e-36e7-48d4-bfbe-76cdf69883b6-kube-api-access-dvqcr\") pod \"downloads-7954f5f757-gw9l6\" (UID: \"2dd0664e-36e7-48d4-bfbe-76cdf69883b6\") " pod="openshift-console/downloads-7954f5f757-gw9l6" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk56f\" (UniqueName: \"kubernetes.io/projected/c941b3ea-ef53-47c4-b10a-6e949b7098d2-kube-api-access-xk56f\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412444 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fktb\" (UniqueName: \"kubernetes.io/projected/476c0833-0a8f-4824-a7fe-6f28aada483b-kube-api-access-2fktb\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412468 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412506 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412526 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412550 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412577 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-apiservice-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412599 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412621 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412646 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-encryption-config\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418681 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-dir\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tr8h\" (UniqueName: \"kubernetes.io/projected/09993f0f-6381-4517-8246-ef1d188bea5c-kube-api-access-5tr8h\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418731 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a971a99c-926f-48f4-88d5-9033085cc89b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-trusted-ca\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418801 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-config\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418823 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418845 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlvh7\" (UniqueName: \"kubernetes.io/projected/2fd9747b-ba54-4fa6-8849-7447d6683c68-kube-api-access-rlvh7\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418889 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h44h4\" (UniqueName: \"kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418934 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc23ad84-d2b5-4f8b-a110-143219eb78a9-serving-cert\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-serving-cert\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418978 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419014 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sd82\" (UniqueName: \"kubernetes.io/projected/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-kube-api-access-4sd82\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-config\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419059 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lcs\" (UniqueName: \"kubernetes.io/projected/818b6964-1c62-4e2e-8079-a41f9bdcb763-kube-api-access-v4lcs\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419083 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419107 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-serving-cert\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419134 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-config\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419156 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419177 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca\") pod 
\"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419201 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-service-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/818b6964-1c62-4e2e-8079-a41f9bdcb763-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419313 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwc4\" (UniqueName: \"kubernetes.io/projected/59b4fd96-82d8-4cf5-a063-393b6f775e45-kube-api-access-5gwc4\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419339 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrr9h\" (UniqueName: \"kubernetes.io/projected/fc23ad84-d2b5-4f8b-a110-143219eb78a9-kube-api-access-nrr9h\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzn7\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.412717 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419481 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.419498 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9wgfm"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.420646 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.421211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941b3ea-ef53-47c4-b10a-6e949b7098d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.421426 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-config\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.421679 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c941b3ea-ef53-47c4-b10a-6e949b7098d2-audit-dir\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.413467 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/818b6964-1c62-4e2e-8079-a41f9bdcb763-images\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.417670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-machine-approver-tls\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.421970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418271 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-config\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.418393 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.417701 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-etcd-client\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.417723 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-serving-cert\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.423220 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7hngj"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.423463 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424132 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/818b6964-1c62-4e2e-8079-a41f9bdcb763-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424510 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c941b3ea-ef53-47c4-b10a-6e949b7098d2-encryption-config\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424759 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424801 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424824 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2wxh"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424836 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d6kbr"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424849 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424862 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6f79s"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424875 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424886 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9hjlq"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.424968 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.426391 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9wgfm"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.427534 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4g9th"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.428066 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.436900 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4g9th"] Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.444826 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.464034 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.483500 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.504844 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.519988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aff0752e-d15d-4137-a5a4-ed8c29efbc74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520390 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbthh\" (UniqueName: \"kubernetes.io/projected/aea16266-db6e-4bd6-aac2-8dea60e44c25-kube-api-access-nbthh\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsszc\" (UniqueName: \"kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520488 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv86m\" (UniqueName: 
\"kubernetes.io/projected/79d58c0b-affd-462b-b4ee-1134ede8bcb5-kube-api-access-zv86m\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520519 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-csi-data-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8430c27-e731-481d-8579-06bd5c157f2c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520811 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-socket-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520849 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-mountpoint-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit-dir\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520902 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxx8\" (UniqueName: \"kubernetes.io/projected/5196711d-0b39-4630-a0bc-d210d210fc4b-kube-api-access-ndxx8\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-encryption-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 
Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520935 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520953 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfn84\" (UniqueName: \"kubernetes.io/projected/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-kube-api-access-jfn84\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520972 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a971a99c-926f-48f4-88d5-9033085cc89b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.520990 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943916ae-78c3-4ff3-8f1b-71c56ad874dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521009 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a914cea-d605-479e-9f9c-97fedfeddaf4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8z2gm\" (UID: \"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521028 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-trusted-ca\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521046 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521064 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521081 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj295\" (UniqueName: \"kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521098 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9747b-ba54-4fa6-8849-7447d6683c68-serving-cert\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trh8\" (UniqueName: \"kubernetes.io/projected/2a914cea-d605-479e-9f9c-97fedfeddaf4-kube-api-access-5trh8\") pod \"package-server-manager-789f6589d5-8z2gm\" (UID: \"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvqcr\" (UniqueName: \"kubernetes.io/projected/2dd0664e-36e7-48d4-bfbe-76cdf69883b6-kube-api-access-dvqcr\") pod \"downloads-7954f5f757-gw9l6\" (UID: \"2dd0664e-36e7-48d4-bfbe-76cdf69883b6\") " pod="openshift-console/downloads-7954f5f757-gw9l6" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521191 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmc5k\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-kube-api-access-bmc5k\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521251 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrlm\" (UniqueName: \"kubernetes.io/projected/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-kube-api-access-sbrlm\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521269 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521304 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc 
kubenswrapper[4744]: I1205 20:12:45.521321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521364 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521382 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521398 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-apiservice-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6ps\" (UniqueName: \"kubernetes.io/projected/c8430c27-e731-481d-8579-06bd5c157f2c-kube-api-access-cz6ps\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521450 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521465 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521493 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-registration-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521508 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-proxy-tls\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521524 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74w7\" (UniqueName: \"kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2f9\" (UniqueName: \"kubernetes.io/projected/492a9a03-8b00-4fc5-aa95-98a11aa090c7-kube-api-access-np2f9\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521572 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521587 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521603 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2btxm\" (UniqueName: \"kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521618 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpq2\" (UniqueName: \"kubernetes.io/projected/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-kube-api-access-zgpq2\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-srv-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521645 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-images\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a971a99c-926f-48f4-88d5-9033085cc89b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-trusted-ca\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521727 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: 
\"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521809 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc23ad84-d2b5-4f8b-a110-143219eb78a9-serving-cert\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521826 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-serving-cert\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521841 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521887 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgdt\" (UniqueName: \"kubernetes.io/projected/aff0752e-d15d-4137-a5a4-ed8c29efbc74-kube-api-access-5zgdt\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521908 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521922 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-config\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15306917-0f1c-4f26-9eda-637d43a32172-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521971 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vsg2\" (UniqueName: \"kubernetes.io/projected/a099a621-9515-4776-bc62-12fb0fa62340-kube-api-access-9vsg2\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.521987 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9np2m\" (UniqueName: \"kubernetes.io/projected/21c94501-58a7-4b02-94aa-2fc8035777e3-kube-api-access-9np2m\") pod \"migrator-59844c95c7-ctwg7\" (UID: \"21c94501-58a7-4b02-94aa-2fc8035777e3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-plugins-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522016 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522054 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ee820b-1f44-41e2-b44b-b6bb25edb5af-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522086 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn2mx\" (UniqueName: \"kubernetes.io/projected/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-kube-api-access-dn2mx\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522103 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522118 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522133 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15306917-0f1c-4f26-9eda-637d43a32172-proxy-tls\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrr9h\" (UniqueName: \"kubernetes.io/projected/fc23ad84-d2b5-4f8b-a110-143219eb78a9-kube-api-access-nrr9h\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-client\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-webhook-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522208 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-srv-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522223 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46xnv\" (UniqueName: \"kubernetes.io/projected/55540507-8d49-4b29-8c37-30d340e4eb1b-kube-api-access-46xnv\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-node-pullsecrets\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-client\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522281 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/476c0833-0a8f-4824-a7fe-6f28aada483b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522314 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-serving-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-config\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522347 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943916ae-78c3-4ff3-8f1b-71c56ad874dd-config\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522364 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09993f0f-6381-4517-8246-ef1d188bea5c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-image-import-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5196711d-0b39-4630-a0bc-d210d210fc4b-tmpfs\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-metrics-tls\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522454 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522469 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzqhv\" (UniqueName: \"kubernetes.io/projected/3b6ef406-5003-4eb6-bf53-3a195fcface8-kube-api-access-mzqhv\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522497 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/361522f8-b0a1-45d2-baa1-9779678fa54f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522521 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522538 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522553 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09993f0f-6381-4517-8246-ef1d188bea5c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a971a99c-926f-48f4-88d5-9033085cc89b-config\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522604 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b931ded-d187-4535-b266-0d17996f0b27-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522620 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2k2q\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-kube-api-access-v2k2q\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522637 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522666 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwz8s\" (UniqueName: \"kubernetes.io/projected/15306917-0f1c-4f26-9eda-637d43a32172-kube-api-access-lwz8s\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522680 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522701 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fktb\" (UniqueName: \"kubernetes.io/projected/476c0833-0a8f-4824-a7fe-6f28aada483b-kube-api-access-2fktb\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc 
Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522742 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-metrics-tls\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkc4\" (UniqueName: \"kubernetes.io/projected/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-kube-api-access-mrkc4\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522804 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522820 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361522f8-b0a1-45d2-baa1-9779678fa54f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522910 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.522928 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523078 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523233 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tr8h\" (UniqueName: \"kubernetes.io/projected/09993f0f-6381-4517-8246-ef1d188bea5c-kube-api-access-5tr8h\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523249 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523266 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b931ded-d187-4535-b266-0d17996f0b27-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxlnn\" (UniqueName: \"kubernetes.io/projected/57ee820b-1f44-41e2-b44b-b6bb25edb5af-kube-api-access-bxlnn\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523312 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm5wl\" (UniqueName: \"kubernetes.io/projected/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-kube-api-access-fm5wl\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-config\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523348 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlvh7\" (UniqueName: \"kubernetes.io/projected/2fd9747b-ba54-4fa6-8849-7447d6683c68-kube-api-access-rlvh7\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523364 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-serving-cert\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523379 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523394 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2qzb\" (UniqueName: \"kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523447 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xhzdk\" (UniqueName: \"kubernetes.io/projected/51026683-03f8-44ad-bac5-f290d1eaf13d-kube-api-access-xhzdk\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523466 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523482 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523497 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/361522f8-b0a1-45d2-baa1-9779678fa54f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523514 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-service-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ee820b-1f44-41e2-b44b-b6bb25edb5af-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523572 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbmx\" (UniqueName: \"kubernetes.io/projected/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-kube-api-access-kwbmx\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523596 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523614 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943916ae-78c3-4ff3-8f1b-71c56ad874dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523649 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523665 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwc4\" (UniqueName: \"kubernetes.io/projected/59b4fd96-82d8-4cf5-a063-393b6f775e45-kube-api-access-5gwc4\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523683 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzn7\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.523869 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.023854566 +0000 UTC m=+136.253665934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.523976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit-dir\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.525479 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/79d58c0b-affd-462b-b4ee-1134ede8bcb5-node-pullsecrets\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.526649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.526986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-trusted-ca\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.527526 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.527732 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-audit\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.527980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.528094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-serving-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.528651 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09993f0f-6381-4517-8246-ef1d188bea5c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.529010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-encryption-config\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.529120 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-config\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.529655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5196711d-0b39-4630-a0bc-d210d210fc4b-tmpfs\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.529746 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.530275 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.530522 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-apiservice-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.531160 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a971a99c-926f-48f4-88d5-9033085cc89b-config\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.531613 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-client\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.532147 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/79d58c0b-affd-462b-b4ee-1134ede8bcb5-image-import-ca\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.532203 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.532248 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5196711d-0b39-4630-a0bc-d210d210fc4b-webhook-cert\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.532572 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.532827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.533025 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-service-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.533129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.533339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9747b-ba54-4fa6-8849-7447d6683c68-config\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc 
kubenswrapper[4744]: I1205 20:12:45.533613 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc23ad84-d2b5-4f8b-a110-143219eb78a9-config\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.533832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/59b4fd96-82d8-4cf5-a063-393b6f775e45-etcd-ca\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.533937 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b4fd96-82d8-4cf5-a063-393b6f775e45-serving-cert\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.534345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/476c0833-0a8f-4824-a7fe-6f28aada483b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.534426 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-etcd-client\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.534963 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.535153 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc23ad84-d2b5-4f8b-a110-143219eb78a9-serving-cert\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.535339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.535766 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.536306 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a971a99c-926f-48f4-88d5-9033085cc89b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.536655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.536808 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d58c0b-affd-462b-b4ee-1134ede8bcb5-serving-cert\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.540044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9747b-ba54-4fa6-8849-7447d6683c68-serving-cert\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.540226 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09993f0f-6381-4517-8246-ef1d188bea5c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.545017 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.560216 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.564220 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.584404 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.604440 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624192 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trh8\" (UniqueName: \"kubernetes.io/projected/2a914cea-d605-479e-9f9c-97fedfeddaf4-kube-api-access-5trh8\") pod \"package-server-manager-789f6589d5-8z2gm\" (UID: 
\"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmc5k\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-kube-api-access-bmc5k\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrlm\" (UniqueName: \"kubernetes.io/projected/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-kube-api-access-sbrlm\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6ps\" (UniqueName: \"kubernetes.io/projected/c8430c27-e731-481d-8579-06bd5c157f2c-kube-api-access-cz6ps\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624324 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624375 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624389 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-registration-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-proxy-tls\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624454 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74w7\" (UniqueName: \"kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624605 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2f9\" (UniqueName: \"kubernetes.io/projected/492a9a03-8b00-4fc5-aa95-98a11aa090c7-kube-api-access-np2f9\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624607 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624659 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpq2\" (UniqueName: \"kubernetes.io/projected/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-kube-api-access-zgpq2\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624674 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2btxm\" (UniqueName: \"kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624699 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-srv-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624714 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-images\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624730 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.624749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-registration-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: 
I1205 20:12:45.625318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625508 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgdt\" (UniqueName: \"kubernetes.io/projected/aff0752e-d15d-4137-a5a4-ed8c29efbc74-kube-api-access-5zgdt\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625653 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15306917-0f1c-4f26-9eda-637d43a32172-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625676 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vsg2\" (UniqueName: \"kubernetes.io/projected/a099a621-9515-4776-bc62-12fb0fa62340-kube-api-access-9vsg2\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625724 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9np2m\" (UniqueName: \"kubernetes.io/projected/21c94501-58a7-4b02-94aa-2fc8035777e3-kube-api-access-9np2m\") pod \"migrator-59844c95c7-ctwg7\" (UID: \"21c94501-58a7-4b02-94aa-2fc8035777e3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625745 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-plugins-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ee820b-1f44-41e2-b44b-b6bb25edb5af-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625839 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn2mx\" (UniqueName: \"kubernetes.io/projected/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-kube-api-access-dn2mx\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.625951 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626007 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15306917-0f1c-4f26-9eda-637d43a32172-proxy-tls\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626024 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-plugins-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626030 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-srv-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626083 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46xnv\" (UniqueName: \"kubernetes.io/projected/55540507-8d49-4b29-8c37-30d340e4eb1b-kube-api-access-46xnv\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943916ae-78c3-4ff3-8f1b-71c56ad874dd-config\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626142 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-metrics-tls\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626163 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626187 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzqhv\" (UniqueName: \"kubernetes.io/projected/3b6ef406-5003-4eb6-bf53-3a195fcface8-kube-api-access-mzqhv\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626219 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/361522f8-b0a1-45d2-baa1-9779678fa54f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 
05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626242 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626266 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626287 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626324 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626343 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b931ded-d187-4535-b266-0d17996f0b27-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626367 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2k2q\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-kube-api-access-v2k2q\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626415 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " 
pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626439 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwz8s\" (UniqueName: \"kubernetes.io/projected/15306917-0f1c-4f26-9eda-637d43a32172-kube-api-access-lwz8s\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626619 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-metrics-tls\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626662 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkc4\" (UniqueName: \"kubernetes.io/projected/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-kube-api-access-mrkc4\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626693 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15306917-0f1c-4f26-9eda-637d43a32172-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/361522f8-b0a1-45d2-baa1-9779678fa54f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626770 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626788 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626806 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626835 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626856 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626884 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b931ded-d187-4535-b266-0d17996f0b27-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626903 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxlnn\" (UniqueName: \"kubernetes.io/projected/57ee820b-1f44-41e2-b44b-b6bb25edb5af-kube-api-access-bxlnn\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626926 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm5wl\" (UniqueName: \"kubernetes.io/projected/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-kube-api-access-fm5wl\") pod \"olm-operator-6b444d44fb-qfxk4\" 
(UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626946 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2qzb\" (UniqueName: \"kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.626980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhzdk\" (UniqueName: \"kubernetes.io/projected/51026683-03f8-44ad-bac5-f290d1eaf13d-kube-api-access-xhzdk\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ee820b-1f44-41e2-b44b-b6bb25edb5af-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627055 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/361522f8-b0a1-45d2-baa1-9779678fa54f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627071 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" 
Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbmx\" (UniqueName: \"kubernetes.io/projected/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-kube-api-access-kwbmx\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943916ae-78c3-4ff3-8f1b-71c56ad874dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627163 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627179 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aff0752e-d15d-4137-a5a4-ed8c29efbc74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627223 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbthh\" (UniqueName: \"kubernetes.io/projected/aea16266-db6e-4bd6-aac2-8dea60e44c25-kube-api-access-nbthh\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsszc\" (UniqueName: \"kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627257 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-csi-data-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627275 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627307 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8430c27-e731-481d-8579-06bd5c157f2c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627340 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-socket-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627355 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-mountpoint-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627372 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627403 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-trusted-ca\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943916ae-78c3-4ff3-8f1b-71c56ad874dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: 
\"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a914cea-d605-479e-9f9c-97fedfeddaf4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8z2gm\" (UID: \"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.627470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.628112 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.628677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b931ded-d187-4535-b266-0d17996f0b27-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.628768 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-mountpoint-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.628922 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-csi-data-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.629181 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.629190 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.129176313 +0000 UTC m=+136.358987691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.629446 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.629802 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.629986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aff0752e-d15d-4137-a5a4-ed8c29efbc74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.630401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.630799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b6ef406-5003-4eb6-bf53-3a195fcface8-socket-dir\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.630886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.631320 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.632382 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.633550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.633939 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.634106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.634646 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.634970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c8430c27-e731-481d-8579-06bd5c157f2c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.636345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b931ded-d187-4535-b266-0d17996f0b27-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.637114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.644912 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.650904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.664729 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.684130 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.704542 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.713152 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.724883 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.728423 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.728575 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.228550038 +0000 UTC m=+136.458361426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.729532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.729806 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.22979266 +0000 UTC m=+136.459604118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.730128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.744815 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.764453 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.785580 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.805648 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.811208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ee820b-1f44-41e2-b44b-b6bb25edb5af-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.824362 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.831727 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.831880 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.331859724 +0000 UTC m=+136.561671102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.832448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.832794 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.332780567 +0000 UTC m=+136.562591945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.845039 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.849375 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ee820b-1f44-41e2-b44b-b6bb25edb5af-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.864355 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.872052 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-srv-cert\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.894763 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.898056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.905636 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.916048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.925152 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.935595 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.935862 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.435829267 +0000 UTC m=+136.665640675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.936925 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:45 crc kubenswrapper[4744]: E1205 20:12:45.937456 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.437434908 +0000 UTC m=+136.667246296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.940608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.944934 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.951731 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.964164 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.967914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:45 crc kubenswrapper[4744]: I1205 20:12:45.984458 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.004867 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.010344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.023971 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.037683 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.037881 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.537834449 +0000 UTC m=+136.767645867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.038574 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.039178 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.539157213 +0000 UTC m=+136.768968621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.045707 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.065546 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.084500 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.090999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/361522f8-b0a1-45d2-baa1-9779678fa54f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.105122 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.109502 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361522f8-b0a1-45d2-baa1-9779678fa54f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.125056 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.140700 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.140908 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.640881838 +0000 UTC m=+136.870693236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.141649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.142126 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.642105209 +0000 UTC m=+136.871916587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.145711 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.153124 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-metrics-tls\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.175792 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.182493 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-trusted-ca\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.184495 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.205438 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.215924 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a914cea-d605-479e-9f9c-97fedfeddaf4-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-8z2gm\" (UID: \"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.225438 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.231341 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-metrics-tls\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.242512 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.242761 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.742732016 +0000 UTC m=+136.972543414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.243878 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.244716 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.744693456 +0000 UTC m=+136.974504894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.244867 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.264165 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.284772 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.304840 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.311466 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15306917-0f1c-4f26-9eda-637d43a32172-proxy-tls\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.324572 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.342741 4744 request.go:700] Waited for 1.004926415s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.345380 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.345801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.346034 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.845969191 +0000 UTC m=+137.075780599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.347760 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.348246 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.848231948 +0000 UTC m=+137.078043316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.356013 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-images\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.364604 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.385240 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.399939 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-proxy-tls\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.425693 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.444911 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.449817 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.450085 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.950056066 +0000 UTC m=+137.179867474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.450467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.451155 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:46.951129334 +0000 UTC m=+137.180940732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.465350 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.473882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.494194 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.496130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.504251 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.524077 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.531578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/492a9a03-8b00-4fc5-aa95-98a11aa090c7-srv-cert\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.545050 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.552799 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.553185 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.053152597 +0000 UTC m=+137.282964005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.554091 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.554678 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.054658986 +0000 UTC m=+137.284470384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.564767 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.584746 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.595541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943916ae-78c3-4ff3-8f1b-71c56ad874dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.604858 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.607915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943916ae-78c3-4ff3-8f1b-71c56ad874dd-config\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.624495 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.624900 4744 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition 
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.624948 4744 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.624996 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert podName:55540507-8d49-4b29-8c37-30d340e4eb1b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.124971393 +0000 UTC m=+137.354782801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert") pod "service-ca-operator-777779d784-w8h6p" (UID: "55540507-8d49-4b29-8c37-30d340e4eb1b") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.625054 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth podName:aea16266-db6e-4bd6-aac2-8dea60e44c25 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.125027104 +0000 UTC m=+137.354838512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth") pod "router-default-5444994796-lk6bf" (UID: "aea16266-db6e-4bd6-aac2-8dea60e44c25") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.625134 4744 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.625137 4744 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.625185 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs podName:51026683-03f8-44ad-bac5-f290d1eaf13d nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.125169838 +0000 UTC m=+137.354981246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs") pod "machine-config-server-7hngj" (UID: "51026683-03f8-44ad-bac5-f290d1eaf13d") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.625210 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config podName:55540507-8d49-4b29-8c37-30d340e4eb1b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.125199019 +0000 UTC m=+137.355010417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config") pod "service-ca-operator-777779d784-w8h6p" (UID: "55540507-8d49-4b29-8c37-30d340e4eb1b") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.626593 4744 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.626680 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate podName:aea16266-db6e-4bd6-aac2-8dea60e44c25 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.126660506 +0000 UTC m=+137.356471904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate") pod "router-default-5444994796-lk6bf" (UID: "aea16266-db6e-4bd6-aac2-8dea60e44c25") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627024 4744 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627218 4744 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627450 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert podName:3b5781ba-a2d3-416c-ba05-8ebfb67bba25 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.127428076 +0000 UTC m=+137.357239484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert") pod "ingress-canary-4g9th" (UID: "3b5781ba-a2d3-416c-ba05-8ebfb67bba25") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627117 4744 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627172 4744 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627588 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle podName:a099a621-9515-4776-bc62-12fb0fa62340 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.12756688 +0000 UTC m=+137.357378308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle") pod "service-ca-9c57cc56f-9hjlq" (UID: "a099a621-9515-4776-bc62-12fb0fa62340") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627618 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume podName:52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.127604441 +0000 UTC m=+137.357415839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume") pod "dns-default-6f79s" (UID: "52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627249 4744 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627653 4744 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627712 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token podName:51026683-03f8-44ad-bac5-f290d1eaf13d nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.127697703 +0000 UTC m=+137.357509211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token") pod "machine-config-server-7hngj" (UID: "51026683-03f8-44ad-bac5-f290d1eaf13d") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627741 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls podName:52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.127724384 +0000 UTC m=+137.357535782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls") pod "dns-default-6f79s" (UID: "52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627161 4744 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.627797 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle podName:aea16266-db6e-4bd6-aac2-8dea60e44c25 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.127785845 +0000 UTC m=+137.357597253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle") pod "router-default-5444994796-lk6bf" (UID: "aea16266-db6e-4bd6-aac2-8dea60e44c25") : failed to sync configmap cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.628662 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key podName:a099a621-9515-4776-bc62-12fb0fa62340 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.128634018 +0000 UTC m=+137.358445466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key") pod "service-ca-9c57cc56f-9hjlq" (UID: "a099a621-9515-4776-bc62-12fb0fa62340") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.629796 4744 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.629876 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs podName:aea16266-db6e-4bd6-aac2-8dea60e44c25 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.129860219 +0000 UTC m=+137.359671627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs") pod "router-default-5444994796-lk6bf" (UID: "aea16266-db6e-4bd6-aac2-8dea60e44c25") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.630930 4744 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.631016 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert podName:aff0752e-d15d-4137-a5a4-ed8c29efbc74 nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.130998338 +0000 UTC m=+137.360809746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert") pod "openshift-config-operator-7777fb866f-zhpng" (UID: "aff0752e-d15d-4137-a5a4-ed8c29efbc74") : failed to sync secret cache: timed out waiting for the condition
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.644428 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.655407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.655634 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.155607501 +0000 UTC m=+137.385418909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.656968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.657554 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.1575296 +0000 UTC m=+137.387341058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.664576 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.685127 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.704463 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.724547 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.745539 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.758899 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.759037 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.259014709 +0000 UTC m=+137.488826087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.759735 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.760590 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.260561179 +0000 UTC m=+137.490372577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.766851 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.785576 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.804549 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.824351 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.845411 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.861524 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.861711 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.361674798 +0000 UTC m=+137.591486186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.862628 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.863066 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.363051854 +0000 UTC m=+137.592863352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.864751 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.885027 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.905160 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.924889 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.944920 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.964059 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.964217 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.464193914 +0000 UTC m=+137.694005292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.964257 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.964631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:46 crc kubenswrapper[4744]: E1205 20:12:46.965180 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.465166439 +0000 UTC m=+137.694977817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:46 crc kubenswrapper[4744]: I1205 20:12:46.984590 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.005156 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.025522 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.045322 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.064507 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.066057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.066650 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.566617446 +0000 UTC m=+137.796428854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.066938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.067490 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.567465429 +0000 UTC m=+137.797276837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.085917 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.129452 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk56f\" (UniqueName: \"kubernetes.io/projected/c941b3ea-ef53-47c4-b10a-6e949b7098d2-kube-api-access-xk56f\") pod \"apiserver-7bbb656c7d-xtnpd\" (UID: \"c941b3ea-ef53-47c4-b10a-6e949b7098d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.152022 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h44h4\" (UniqueName: \"kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4\") pod \"controller-manager-879f6c89f-ccqxf\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.162709 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lcs\" (UniqueName: \"kubernetes.io/projected/818b6964-1c62-4e2e-8079-a41f9bdcb763-kube-api-access-v4lcs\") pod \"machine-api-operator-5694c8668f-r5krf\" (UID: \"818b6964-1c62-4e2e-8079-a41f9bdcb763\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.168828 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.169148 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.669116392 +0000 UTC m=+137.898927770 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.169323 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.169545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.169716 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.169866 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.170014 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.170112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.170263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.170543 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.170872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171045 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171197 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171525 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.171770 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.172931 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a099a621-9515-4776-bc62-12fb0fa62340-signing-cabundle\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.173067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55540507-8d49-4b29-8c37-30d340e4eb1b-config\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.173843 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-config-volume\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s"
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.174213 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.674187442 +0000 UTC m=+137.903998850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.174529 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-metrics-tls\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.175318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea16266-db6e-4bd6-aac2-8dea60e44c25-service-ca-bundle\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.178607 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55540507-8d49-4b29-8c37-30d340e4eb1b-serving-cert\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.179048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff0752e-d15d-4137-a5a4-ed8c29efbc74-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.180873 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a099a621-9515-4776-bc62-12fb0fa62340-signing-key\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.182824 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-metrics-certs\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.184107 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-stats-auth\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.184203 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aea16266-db6e-4bd6-aac2-8dea60e44c25-default-certificate\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.185188 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.189653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sd82\" (UniqueName: \"kubernetes.io/projected/7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0-kube-api-access-4sd82\") pod \"machine-approver-56656f9798-9rmhw\" (UID: \"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.205043 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.225501 4744 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.245688 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.262556 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-certs\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.265582 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.272654 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.272871 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.772832689 +0000 UTC m=+138.002644087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.273207 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.273728 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.773706821 +0000 UTC m=+138.003518219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.285273 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.302351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51026683-03f8-44ad-bac5-f290d1eaf13d-node-bootstrap-token\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.305048 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.315558 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.318640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-cert\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.325176 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 05 20:12:47 crc kubenswrapper[4744]: W1205 20:12:47.338167 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9b7529_8e9b_4ecc_9fd0_c43e9de6edb0.slice/crio-e7dd108f12c84d44d5d0246f46203838288bf17c88e775c9d23d795985198c2b WatchSource:0}: Error finding container e7dd108f12c84d44d5d0246f46203838288bf17c88e775c9d23d795985198c2b: Status 404 returned error can't find the container with id e7dd108f12c84d44d5d0246f46203838288bf17c88e775c9d23d795985198c2b
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.345009 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.363000 4744 request.go:700] Waited for 1.934681277s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.364889 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.365192 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.375464 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.376248 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.876211327 +0000 UTC m=+138.106022735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.376515 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.376951 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.876929164 +0000 UTC m=+138.106740572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.412153 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.414722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzn7\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.421022 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.423935 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv86m\" (UniqueName: \"kubernetes.io/projected/79d58c0b-affd-462b-b4ee-1134ede8bcb5-kube-api-access-zv86m\") pod \"apiserver-76f77b778f-htzxr\" (UID: \"79d58c0b-affd-462b-b4ee-1134ede8bcb5\") " pod="openshift-apiserver/apiserver-76f77b778f-htzxr"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.443657 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxx8\" (UniqueName: \"kubernetes.io/projected/5196711d-0b39-4630-a0bc-d210d210fc4b-kube-api-access-ndxx8\") pod \"packageserver-d55dfcdfc-j55sf\" (UID: \"5196711d-0b39-4630-a0bc-d210d210fc4b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.460851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfn84\" (UniqueName: \"kubernetes.io/projected/3532c9be-fdf5-43e2-b5ba-95a678fef5f8-kube-api-access-jfn84\") pod \"control-plane-machine-set-operator-78cbb6b69f-ql5gr\" (UID: \"3532c9be-fdf5-43e2-b5ba-95a678fef5f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.481737 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.481917 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:47.981886423 +0000 UTC m=+138.211697791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.482461 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.482553 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-htzxr"
Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.483283 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2025-12-05 20:12:47.983232437 +0000 UTC m=+138.213043845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.485373 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a971a99c-926f-48f4-88d5-9033085cc89b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnwkb\" (UID: \"a971a99c-926f-48f4-88d5-9033085cc89b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.505561 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj295\" (UniqueName: \"kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295\") pod \"route-controller-manager-6576b87f9c-9rv9c\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.528081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvqcr\" (UniqueName: \"kubernetes.io/projected/2dd0664e-36e7-48d4-bfbe-76cdf69883b6-kube-api-access-dvqcr\") pod \"downloads-7954f5f757-gw9l6\" (UID: \"2dd0664e-36e7-48d4-bfbe-76cdf69883b6\") " pod="openshift-console/downloads-7954f5f757-gw9l6" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.548283 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwc4\" (UniqueName: \"kubernetes.io/projected/59b4fd96-82d8-4cf5-a063-393b6f775e45-kube-api-access-5gwc4\") pod \"etcd-operator-b45778765-l6gl7\" (UID: \"59b4fd96-82d8-4cf5-a063-393b6f775e45\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.554431 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.570932 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.575440 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tr8h\" (UniqueName: \"kubernetes.io/projected/09993f0f-6381-4517-8246-ef1d188bea5c-kube-api-access-5tr8h\") pod \"openshift-controller-manager-operator-756b6f6bc6-rjqnz\" (UID: \"09993f0f-6381-4517-8246-ef1d188bea5c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.579405 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.580737 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fktb\" (UniqueName: \"kubernetes.io/projected/476c0833-0a8f-4824-a7fe-6f28aada483b-kube-api-access-2fktb\") pod \"cluster-samples-operator-665b6dd947-h777m\" (UID: \"476c0833-0a8f-4824-a7fe-6f28aada483b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.583721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.583906 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.083882435 +0000 UTC m=+138.313693803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.584164 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.584452 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.084443119 +0000 UTC m=+138.314254487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.599259 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.620648 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlvh7\" (UniqueName: \"kubernetes.io/projected/2fd9747b-ba54-4fa6-8849-7447d6683c68-kube-api-access-rlvh7\") pod \"authentication-operator-69f744f599-qv6mb\" (UID: \"2fd9747b-ba54-4fa6-8849-7447d6683c68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.639189 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.639500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrr9h\" (UniqueName: \"kubernetes.io/projected/fc23ad84-d2b5-4f8b-a110-143219eb78a9-kube-api-access-nrr9h\") pod \"console-operator-58897d9998-nxjnb\" (UID: \"fc23ad84-d2b5-4f8b-a110-143219eb78a9\") " pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.680407 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5krf"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.680953 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmc5k\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-kube-api-access-bmc5k\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.686694 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.687124 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.187109739 +0000 UTC m=+138.416921107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.701329 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrlm\" (UniqueName: \"kubernetes.io/projected/52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9-kube-api-access-sbrlm\") pod \"dns-default-6f79s\" (UID: \"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9\") " pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.719676 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6ps\" (UniqueName: \"kubernetes.io/projected/c8430c27-e731-481d-8579-06bd5c157f2c-kube-api-access-cz6ps\") pod \"multus-admission-controller-857f4d67dd-d6kbr\" (UID: \"c8430c27-e731-481d-8579-06bd5c157f2c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.733976 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.742192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trh8\" (UniqueName: \"kubernetes.io/projected/2a914cea-d605-479e-9f9c-97fedfeddaf4-kube-api-access-5trh8\") pod \"package-server-manager-789f6589d5-8z2gm\" (UID: \"2a914cea-d605-479e-9f9c-97fedfeddaf4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.753375 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.754701 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.761410 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74w7\" (UniqueName: \"kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7\") pod \"collect-profiles-29416080-6dh4c\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.765656 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.778725 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nxjnb" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.783583 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-htzxr"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.784046 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2f9\" (UniqueName: \"kubernetes.io/projected/492a9a03-8b00-4fc5-aa95-98a11aa090c7-kube-api-access-np2f9\") pod \"catalog-operator-68c6474976-2rvfr\" (UID: \"492a9a03-8b00-4fc5-aa95-98a11aa090c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.789533 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.789544 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.789837 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.289825099 +0000 UTC m=+138.519636467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.799495 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gw9l6" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.804859 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpq2\" (UniqueName: \"kubernetes.io/projected/3b5781ba-a2d3-416c-ba05-8ebfb67bba25-kube-api-access-zgpq2\") pod \"ingress-canary-4g9th\" (UID: \"3b5781ba-a2d3-416c-ba05-8ebfb67bba25\") " pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.818939 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.820129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2btxm\" (UniqueName: \"kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm\") pod \"oauth-openshift-558db77b4-dn5pv\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.832369 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.836999 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6f79s" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.840654 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgdt\" (UniqueName: \"kubernetes.io/projected/aff0752e-d15d-4137-a5a4-ed8c29efbc74-kube-api-access-5zgdt\") pod \"openshift-config-operator-7777fb866f-zhpng\" (UID: \"aff0752e-d15d-4137-a5a4-ed8c29efbc74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.856905 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.859438 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9np2m\" (UniqueName: \"kubernetes.io/projected/21c94501-58a7-4b02-94aa-2fc8035777e3-kube-api-access-9np2m\") pod \"migrator-59844c95c7-ctwg7\" (UID: \"21c94501-58a7-4b02-94aa-2fc8035777e3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.873012 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4g9th" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.876401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.885225 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.889333 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr"] Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.890241 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.891444 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.391425142 +0000 UTC m=+138.621236510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.894769 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.898347 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vsg2\" (UniqueName: \"kubernetes.io/projected/a099a621-9515-4776-bc62-12fb0fa62340-kube-api-access-9vsg2\") pod \"service-ca-9c57cc56f-9hjlq\" (UID: \"a099a621-9515-4776-bc62-12fb0fa62340\") " pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.900240 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.907509 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.911309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" event={"ID":"c941b3ea-ef53-47c4-b10a-6e949b7098d2","Type":"ContainerStarted","Data":"dae051bc1579ac6d8a8b370d08774eaf70aa41f0b845b3f9ed065d4179a04793"} Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.912567 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" event={"ID":"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0","Type":"ContainerStarted","Data":"e7dd108f12c84d44d5d0246f46203838288bf17c88e775c9d23d795985198c2b"} Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.917481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn2mx\" (UniqueName: \"kubernetes.io/projected/3f9ec60a-e0c3-4a0c-8b43-809eb09fb365-kube-api-access-dn2mx\") pod \"machine-config-operator-74547568cd-xtrg9\" (UID: \"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.941778 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzqhv\" (UniqueName: \"kubernetes.io/projected/3b6ef406-5003-4eb6-bf53-3a195fcface8-kube-api-access-mzqhv\") pod \"csi-hostpathplugin-9wgfm\" (UID: \"3b6ef406-5003-4eb6-bf53-3a195fcface8\") " pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.959847 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32fe940e-dd94-4dd9-921c-fcd99ddccb2a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llkqs\" (UID: \"32fe940e-dd94-4dd9-921c-fcd99ddccb2a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.977688 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46xnv\" (UniqueName: \"kubernetes.io/projected/55540507-8d49-4b29-8c37-30d340e4eb1b-kube-api-access-46xnv\") pod \"service-ca-operator-777779d784-w8h6p\" (UID: \"55540507-8d49-4b29-8c37-30d340e4eb1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:47 crc kubenswrapper[4744]: I1205 20:12:47.992435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:47 crc kubenswrapper[4744]: E1205 20:12:47.992836 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.492817818 +0000 UTC m=+138.722629186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.009578 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.011493 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkc4\" (UniqueName: \"kubernetes.io/projected/e9f2d8c0-c11b-4910-aa67-5be21f46b32d-kube-api-access-mrkc4\") pod \"dns-operator-744455d44c-g2wxh\" (UID: \"e9f2d8c0-c11b-4910-aa67-5be21f46b32d\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.022343 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.033841 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2k2q\" (UniqueName: \"kubernetes.io/projected/5b931ded-d187-4535-b266-0d17996f0b27-kube-api-access-v2k2q\") pod \"cluster-image-registry-operator-dc59b4c8b-6wx2p\" (UID: \"5b931ded-d187-4535-b266-0d17996f0b27\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.034973 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.038659 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwz8s\" (UniqueName: \"kubernetes.io/projected/15306917-0f1c-4f26-9eda-637d43a32172-kube-api-access-lwz8s\") pod \"machine-config-controller-84d6567774-cz77l\" (UID: \"15306917-0f1c-4f26-9eda-637d43a32172\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.045091 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.055017 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.059415 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbthh\" (UniqueName: \"kubernetes.io/projected/aea16266-db6e-4bd6-aac2-8dea60e44c25-kube-api-access-nbthh\") pod \"router-default-5444994796-lk6bf\" (UID: \"aea16266-db6e-4bd6-aac2-8dea60e44c25\") " pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.076377 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.083225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzdk\" (UniqueName: \"kubernetes.io/projected/51026683-03f8-44ad-bac5-f290d1eaf13d-kube-api-access-xhzdk\") pod \"machine-config-server-7hngj\" (UID: \"51026683-03f8-44ad-bac5-f290d1eaf13d\") " pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.093931 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.094250 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.594233395 +0000 UTC m=+138.824044763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.097624 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.099924 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/361522f8-b0a1-45d2-baa1-9779678fa54f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-48knz\" (UID: \"361522f8-b0a1-45d2-baa1-9779678fa54f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.107749 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.117245 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.123253 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.130861 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsszc\" (UniqueName: \"kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc\") pod \"marketplace-operator-79b997595-tr74j\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.138818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbmx\" (UniqueName: \"kubernetes.io/projected/9aad9ec0-529f-41f0-bbc7-5b16d4346c9b-kube-api-access-kwbmx\") pod \"kube-storage-version-migrator-operator-b67b599dd-gslcv\" (UID: \"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.162811 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.167961 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7hngj" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.168761 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943916ae-78c3-4ff3-8f1b-71c56ad874dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jc4pg\" (UID: \"943916ae-78c3-4ff3-8f1b-71c56ad874dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.178310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm5wl\" (UniqueName: \"kubernetes.io/projected/ceb8b187-9126-4d1e-8201-b4d12a0d1e7a-kube-api-access-fm5wl\") pod \"olm-operator-6b444d44fb-qfxk4\" (UID: \"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.195813 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.196205 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.696186436 +0000 UTC m=+138.925997824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.213621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2qzb\" (UniqueName: \"kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb\") pod \"console-f9d7485db-lgg2b\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.215863 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.228419 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxlnn\" (UniqueName: \"kubernetes.io/projected/57ee820b-1f44-41e2-b44b-b6bb25edb5af-kube-api-access-bxlnn\") pod \"openshift-apiserver-operator-796bbdcf4f-n7t68\" (UID: \"57ee820b-1f44-41e2-b44b-b6bb25edb5af\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.258783 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.275456 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.283621 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.294245 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.297468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.297673 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.797642795 +0000 UTC m=+139.027454163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.299332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.300735 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.301814 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.801786691 +0000 UTC m=+139.031598089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.366760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.388760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.400404 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.400553 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.900521749 +0000 UTC m=+139.130333157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.401263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.401851 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:48.901811822 +0000 UTC m=+139.131623200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.502848 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.502893 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.00286299 +0000 UTC m=+139.232674398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.503252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.503815 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.003792504 +0000 UTC m=+139.233603912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.604343 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.604757 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.104716589 +0000 UTC m=+139.334528007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.605246 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.605795 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.105778316 +0000 UTC m=+139.335589724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.707281 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.707622 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.207586973 +0000 UTC m=+139.437398381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.708707 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.709120 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.209094623 +0000 UTC m=+139.438906031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:48 crc kubenswrapper[4744]: I1205 20:12:48.809857 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:48 crc kubenswrapper[4744]: E1205 20:12:48.812988 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:49.312920692 +0000 UTC m=+139.542732090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
[... the same MountVolume.MountDevice / UnmountVolume.TearDown pair for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 (pod image-registry-697d97f7c8-628ml mounting, pod 8f668bae-612b-4b75-9490-919e737c6a3b unmounting) repeats essentially verbatim every ~100 ms from Dec 05 20:12:48.7 through Dec 05 20:12:50.5, each attempt stamped "No retries permitted until <now+500ms> (durationBeforeRetry 500ms)" and each failing with the same "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" error; roughly a dozen identical cycles are elided here ...]
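Why this loops: both directions of the reconcile hit the same lookup. The kubelet restarted at the top of this journal, which emptied its in-memory CSI driver registry; kubevirt.io.hostpath-provisioner only reappears there once the provisioner's node plugin re-registers over the kubelet's plugin-registration socket. Until then, attacher.MountDevice (bringing image-registry-697d97f7c8-628ml up) and Unmounter.TearDownAt (cleaning up pod 8f668bae-612b-4b75-9490-919e737c6a3b) both fail to build a CSI client, so neither the new mount nor the old unmount can progress. The storage.k8s.io/v1 CSINode object mirrors this per-node registration state, which makes it a convenient external check. A minimal Go sketch of that check follows; it is a hypothetical diagnostic, not part of the kubelet, and assumes a reachable kubeconfig at the default path and the node name "crc" used throughout this journal:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes $HOME/.kube/config points at the cluster that produced this log.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // CSINode mirrors the kubelet's driver registry; a name missing from
        // spec.drivers is exactly the "not found in the list of registered CSI
        // drivers" condition in the retries above. "crc" is the node name from
        // this journal.
        csiNode, err := client.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Printf("registered CSI driver: %s (nodeID %s)\n", d.Name, d.NodeID)
        }
    }

Once the hostpath-provisioner pod is back, its name should appear in spec.drivers and the retries above should clear on a subsequent reconciler pass. The log resumes below with the tail of this retry loop, now interleaved with container-start events as the node's other pods come up.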
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.574553 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod818b6964_1c62_4e2e_8079_a41f9bdcb763.slice/crio-94a7448d139a78b2309769f3e8edfe03b92498b59d52cc767c68e2d9defc4b3f WatchSource:0}: Error finding container 94a7448d139a78b2309769f3e8edfe03b92498b59d52cc767c68e2d9defc4b3f: Status 404 returned error can't find the container with id 94a7448d139a78b2309769f3e8edfe03b92498b59d52cc767c68e2d9defc4b3f Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.578077 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9429a50e_b1ff_480d_b8af_d0f095f8cd86.slice/crio-52a2ebed82a774207cec77751352b48d4b283fc2a9207d7cbbed035ea7339aa7 WatchSource:0}: Error finding container 52a2ebed82a774207cec77751352b48d4b283fc2a9207d7cbbed035ea7339aa7: Status 404 returned error can't find the container with id 52a2ebed82a774207cec77751352b48d4b283fc2a9207d7cbbed035ea7339aa7 Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.581383 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d58c0b_affd_462b_b4ee_1134ede8bcb5.slice/crio-4230514a01cbdf125091e6794b51ec5900f000ab1d3e132d02e42939c09ef9a9 WatchSource:0}: Error finding container 4230514a01cbdf125091e6794b51ec5900f000ab1d3e132d02e42939c09ef9a9: Status 404 returned error can't find the container with id 4230514a01cbdf125091e6794b51ec5900f000ab1d3e132d02e42939c09ef9a9 Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.581663 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda971a99c_926f_48f4_88d5_9033085cc89b.slice/crio-d941785cb4d08c4e60584acd8cb1f49fbb5b7cf8d96b90148a1b572331510519 WatchSource:0}: Error finding container d941785cb4d08c4e60584acd8cb1f49fbb5b7cf8d96b90148a1b572331510519: Status 404 returned error can't find the container with id d941785cb4d08c4e60584acd8cb1f49fbb5b7cf8d96b90148a1b572331510519 Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.583738 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5196711d_0b39_4630_a0bc_d210d210fc4b.slice/crio-62b2ef2ac9839342a5886f4e13be6fbb05f05849416ca1c4762bbcab77ed8b2b WatchSource:0}: Error finding container 62b2ef2ac9839342a5886f4e13be6fbb05f05849416ca1c4762bbcab77ed8b2b: Status 404 returned error can't find the container with id 62b2ef2ac9839342a5886f4e13be6fbb05f05849416ca1c4762bbcab77ed8b2b Dec 05 20:12:50 crc kubenswrapper[4744]: W1205 20:12:50.592432 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3532c9be_fdf5_43e2_b5ba_95a678fef5f8.slice/crio-e2ce46affad7948a69f45e2f299165e78b0ec6404434a3e7aee362c80a2e8e3c WatchSource:0}: Error finding container 
e2ce46affad7948a69f45e2f299165e78b0ec6404434a3e7aee362c80a2e8e3c: Status 404 returned error can't find the container with id e2ce46affad7948a69f45e2f299165e78b0ec6404434a3e7aee362c80a2e8e3c Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.657821 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.659403 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.159320149 +0000 UTC m=+141.389131527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.759119 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.759317 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.259278089 +0000 UTC m=+141.489089457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.760396 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.760721 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.260704966 +0000 UTC m=+141.490516334 (durationBeforeRetry 500ms). 
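The manager.go:1169 warnings just above are a separate, transient symptom: cadvisor receives a cgroup watch event for a freshly created crio-<id> scope and the container lookup returns 404, apparently because the event raced ahead of the runtime's bookkeeping; the matching "ContainerStarted" PLEG events a few lines below show each of these containers did start. When reading such entries, the pod UID and container ID are embedded in the cgroup path itself. A small hypothetical helper for pulling them out (note that cgroup names encode the pod UID with underscores where the API object uses hyphens):

    package main

    import (
        "fmt"
        "regexp"
        "strings"
    )

    // Matches the crio cgroup paths in the watch-event warnings above, e.g.
    // .../kubepods-burstable-pod818b6964_1c62_4e2e_8079_a41f9bdcb763.slice/crio-94a7448d...
    var cgroupRe = regexp.MustCompile(`pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)$`)

    func main() {
        path := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod818b6964_1c62_4e2e_8079_a41f9bdcb763.slice/crio-94a7448d139a78b2309769f3e8edfe03b92498b59d52cc767c68e2d9defc4b3f"
        m := cgroupRe.FindStringSubmatch(path)
        if m == nil {
            panic("unexpected cgroup path layout")
        }
        // Cgroup names use '_' where the pod UID uses '-'.
        fmt.Println("pod UID:     ", strings.ReplaceAll(m[1], "_", "-"))
        fmt.Println("container ID:", m[2])
    }

For example, the first warning's path decodes to pod UID 818b6964-1c62-4e2e-8079-a41f9bdcb763, which is the openshift-machine-api/machine-api-operator pod whose ContainerStarted event for that same container ID lands at 20:12:50.961 below.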
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.863622 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.863961 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.36394246 +0000 UTC m=+141.593753838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.864340 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.864631 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.364621687 +0000 UTC m=+141.594433075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.868017 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qv6mb"] Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.923937 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" event={"ID":"3532c9be-fdf5-43e2-b5ba-95a678fef5f8","Type":"ContainerStarted","Data":"e2ce46affad7948a69f45e2f299165e78b0ec6404434a3e7aee362c80a2e8e3c"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.943173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" event={"ID":"a971a99c-926f-48f4-88d5-9033085cc89b","Type":"ContainerStarted","Data":"d941785cb4d08c4e60584acd8cb1f49fbb5b7cf8d96b90148a1b572331510519"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.961346 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" event={"ID":"818b6964-1c62-4e2e-8079-a41f9bdcb763","Type":"ContainerStarted","Data":"94a7448d139a78b2309769f3e8edfe03b92498b59d52cc767c68e2d9defc4b3f"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.964934 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:50 crc kubenswrapper[4744]: E1205 20:12:50.965279 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.465265805 +0000 UTC m=+141.695077173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.965433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" event={"ID":"79d58c0b-affd-462b-b4ee-1134ede8bcb5","Type":"ContainerStarted","Data":"4230514a01cbdf125091e6794b51ec5900f000ab1d3e132d02e42939c09ef9a9"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.990918 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" event={"ID":"5196711d-0b39-4630-a0bc-d210d210fc4b","Type":"ContainerStarted","Data":"62b2ef2ac9839342a5886f4e13be6fbb05f05849416ca1c4762bbcab77ed8b2b"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.991875 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" event={"ID":"9429a50e-b1ff-480d-b8af-d0f095f8cd86","Type":"ContainerStarted","Data":"52a2ebed82a774207cec77751352b48d4b283fc2a9207d7cbbed035ea7339aa7"} Dec 05 20:12:50 crc kubenswrapper[4744]: I1205 20:12:50.992930 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" event={"ID":"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0","Type":"ContainerStarted","Data":"e784d293c50910fabeb03e24045c44e98b487aa2299f0d1a0b6eac24ae2bdab4"} Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.020054 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4g9th"] Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.068846 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.069139 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.569127094 +0000 UTC m=+141.798938462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.086901 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc941b3ea_ef53_47c4_b10a_6e949b7098d2.slice/crio-conmon-18d4222877aa03cd29c1e79c31f84b06624e4922a35fe677e8e3f605d3fe34ae.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.140131 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6f79s"] Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.143220 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dn5pv"] Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.169362 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.669345791 +0000 UTC m=+141.899157159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.169402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.169654 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.169916 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.669909936 +0000 UTC m=+141.899721304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.271460 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.271816 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.771785705 +0000 UTC m=+142.001597073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.271898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.275191 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.775170021 +0000 UTC m=+142.004981389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.346022 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.352991 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr"] Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.380278 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.380575 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nxjnb"] Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.380590 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.880566792 +0000 UTC m=+142.110378160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.381764 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.400827 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.900806071 +0000 UTC m=+142.130617439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.482796 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.483129 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:51.983112518 +0000 UTC m=+142.212923876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.584708 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.585885 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.085867299 +0000 UTC m=+142.315678667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.686737 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.687107 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.187091291 +0000 UTC m=+142.416902659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.788563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.795546 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.295526949 +0000 UTC m=+142.525338307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.893255 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.893619 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.393605121 +0000 UTC m=+142.623416489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.983329 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm"] Dec 05 20:12:51 crc kubenswrapper[4744]: I1205 20:12:51.995067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:51 crc kubenswrapper[4744]: E1205 20:12:51.995363 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.495351967 +0000 UTC m=+142.725163335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.024469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" event={"ID":"9429a50e-b1ff-480d-b8af-d0f095f8cd86","Type":"ContainerStarted","Data":"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.024797 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.033170 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.034308 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" event={"ID":"f9c687ae-84e1-44ed-801d-abbbff13acd9","Type":"ContainerStarted","Data":"145066b4556f16257df7179e48587828e23fb99560333d69419e4c3f39cd8eac"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.037606 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f79s" event={"ID":"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9","Type":"ContainerStarted","Data":"74caa2a0952600f4ed4476077545905ccfe90bc2c1baca56561943c9a362389a"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.037628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f79s" event={"ID":"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9","Type":"ContainerStarted","Data":"08d4b61dd68059bf8da6b38e949d469b74977969d5d23833f6b5ccafa6f28846"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.038874 4744 generic.go:334] "Generic (PLEG): container finished" podID="c941b3ea-ef53-47c4-b10a-6e949b7098d2" containerID="18d4222877aa03cd29c1e79c31f84b06624e4922a35fe677e8e3f605d3fe34ae" exitCode=0 Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.040276 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" event={"ID":"c941b3ea-ef53-47c4-b10a-6e949b7098d2","Type":"ContainerDied","Data":"18d4222877aa03cd29c1e79c31f84b06624e4922a35fe677e8e3f605d3fe34ae"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.046529 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" event={"ID":"7e9b7529-8e9b-4ecc-9fd0-c43e9de6edb0","Type":"ContainerStarted","Data":"d8c8ffd0e1b893ace4aa026fe41f28290ac2c753a153d3f7d73fea3b39a691a6"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.064178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lk6bf" event={"ID":"aea16266-db6e-4bd6-aac2-8dea60e44c25","Type":"ContainerStarted","Data":"de3aa555ee550981dd3f995bc8b4068088ea1b2500e6ecc1b0275906ef450164"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.064217 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-lk6bf" event={"ID":"aea16266-db6e-4bd6-aac2-8dea60e44c25","Type":"ContainerStarted","Data":"e43259c9b622b0ae9c5d4400a1549ba71f0449aa30e59cdb7d653a70e5412f3e"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.069332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4g9th" event={"ID":"3b5781ba-a2d3-416c-ba05-8ebfb67bba25","Type":"ContainerStarted","Data":"fb7ed76738ecf09990e96fb4cb1b596dc63e6f93a9344a377f4931a79741fc6b"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.069366 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4g9th" event={"ID":"3b5781ba-a2d3-416c-ba05-8ebfb67bba25","Type":"ContainerStarted","Data":"d79755eda7b0df17182add332e80801edb6cf946917c308f26d8f91118a849fa"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.074646 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" podStartSLOduration=122.074632754 podStartE2EDuration="2m2.074632754s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.055651867 +0000 UTC m=+142.285463235" watchObservedRunningTime="2025-12-05 20:12:52.074632754 +0000 UTC m=+142.304444122" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.102670 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.103399 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.603384444 +0000 UTC m=+142.833195812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.106315 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9rmhw" podStartSLOduration=124.106278498 podStartE2EDuration="2m4.106278498s" podCreationTimestamp="2025-12-05 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.076519643 +0000 UTC m=+142.306331011" watchObservedRunningTime="2025-12-05 20:12:52.106278498 +0000 UTC m=+142.336089866" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.130342 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:12:52 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:12:52 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:12:52 crc kubenswrapper[4744]: healthz check failed Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.130391 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.131433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" event={"ID":"a971a99c-926f-48f4-88d5-9033085cc89b","Type":"ContainerStarted","Data":"e12739b22cb5e80b10151a16b16e281122d4ba9440055deeb54910185e3998be"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.131463 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.138255 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7hngj" event={"ID":"51026683-03f8-44ad-bac5-f290d1eaf13d","Type":"ContainerStarted","Data":"6ee6dfc332a7c64e0cdc67574f1fdf6c4c1e86e2bee5b4d5ce4b9e05fc8db80e"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.138312 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7hngj" event={"ID":"51026683-03f8-44ad-bac5-f290d1eaf13d","Type":"ContainerStarted","Data":"31cfd6aec8d3495352ce63493e100e34fb987f3547a305c8d4d8cbdd84b49992"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.150196 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnwkb" podStartSLOduration=121.150178146 podStartE2EDuration="2m1.150178146s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.147451586 +0000 UTC m=+142.377262954" watchObservedRunningTime="2025-12-05 20:12:52.150178146 +0000 UTC m=+142.379989524" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.174896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" event={"ID":"3532c9be-fdf5-43e2-b5ba-95a678fef5f8","Type":"ContainerStarted","Data":"72fc53f41b228748adb6ff35c406c9f4c40fc91f51b8ebc4a9c397cd73c51497"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.201594 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" event={"ID":"818b6964-1c62-4e2e-8079-a41f9bdcb763","Type":"ContainerStarted","Data":"d4a4eca8b8299868f9c2520493bd15002be54924158489763b5733aa8765af3a"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.204739 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.210715 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.710700622 +0000 UTC m=+142.940511990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.213928 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lk6bf" podStartSLOduration=121.213911295 podStartE2EDuration="2m1.213911295s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.188723438 +0000 UTC m=+142.418534806" watchObservedRunningTime="2025-12-05 20:12:52.213911295 +0000 UTC m=+142.443722663" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.227162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nxjnb" event={"ID":"fc23ad84-d2b5-4f8b-a110-143219eb78a9","Type":"ContainerStarted","Data":"da04384a1e679a30fb5df23d19473e8c6dc1e76935000bc78e70461ecbd79845"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.245033 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4g9th" podStartSLOduration=7.245014585 podStartE2EDuration="7.245014585s" podCreationTimestamp="2025-12-05 20:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.21447659 
+0000 UTC m=+142.444287958" watchObservedRunningTime="2025-12-05 20:12:52.245014585 +0000 UTC m=+142.474825953" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.254400 4744 generic.go:334] "Generic (PLEG): container finished" podID="79d58c0b-affd-462b-b4ee-1134ede8bcb5" containerID="2510c6a16163e92d48ea170f17894b1a89a2621355fd54cfb63b4ca22f707d82" exitCode=0 Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.254493 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" event={"ID":"79d58c0b-affd-462b-b4ee-1134ede8bcb5","Type":"ContainerDied","Data":"2510c6a16163e92d48ea170f17894b1a89a2621355fd54cfb63b4ca22f707d82"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.260995 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" event={"ID":"492a9a03-8b00-4fc5-aa95-98a11aa090c7","Type":"ContainerStarted","Data":"358b4f09bbbff93e314f12d247fcbb591b9c5b0558184aeaf7a12ddea6d7bcc8"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.261513 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.264564 4744 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2rvfr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.264593 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" podUID="492a9a03-8b00-4fc5-aa95-98a11aa090c7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.265575 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ql5gr" podStartSLOduration=121.265559653 podStartE2EDuration="2m1.265559653s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.264782633 +0000 UTC m=+142.494594001" watchObservedRunningTime="2025-12-05 20:12:52.265559653 +0000 UTC m=+142.495371021" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.265645 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7hngj" podStartSLOduration=7.265642115 podStartE2EDuration="7.265642115s" podCreationTimestamp="2025-12-05 20:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.246310458 +0000 UTC m=+142.476121826" watchObservedRunningTime="2025-12-05 20:12:52.265642115 +0000 UTC m=+142.495453483" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.281683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lgg2b" event={"ID":"55e3c0e4-3a89-48b0-a218-f89546c09a5d","Type":"ContainerStarted","Data":"660fac3d7a65e1f8968356f827b875088609e7e24c40cc6c4dd47da4d7ed80c7"} Dec 05 20:12:52 crc 
kubenswrapper[4744]: I1205 20:12:52.297956 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" event={"ID":"5196711d-0b39-4630-a0bc-d210d210fc4b","Type":"ContainerStarted","Data":"016f66e9da7605657a37b83002b6bfffe2ed4d55cd6df2efb6e2aba0dc8b5ee4"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.298915 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.304067 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" event={"ID":"2fd9747b-ba54-4fa6-8849-7447d6683c68","Type":"ContainerStarted","Data":"c6df0e1efb98705e59351993c3c0648bdb1ae45dc0ef73a99797f7a589094d4c"} Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.304094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" event={"ID":"2fd9747b-ba54-4fa6-8849-7447d6683c68","Type":"ContainerStarted","Data":"026f064367198101ccf4bf23c74bc491e67d140f2d81a86552d660042acd8338"} Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.310850 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.810828247 +0000 UTC m=+143.040639615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.311670 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.312060 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.313882 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.813869485 +0000 UTC m=+143.043680853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.320851 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.336676 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" podStartSLOduration=121.33666015 podStartE2EDuration="2m1.33666015s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.309768219 +0000 UTC m=+142.539579587" watchObservedRunningTime="2025-12-05 20:12:52.33666015 +0000 UTC m=+142.566471518" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.338579 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" podStartSLOduration=122.33857223 podStartE2EDuration="2m2.33857223s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.336047735 +0000 UTC m=+142.565859103" watchObservedRunningTime="2025-12-05 20:12:52.33857223 +0000 UTC m=+142.568383598" Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.376414 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c94501_58a7_4b02_94aa_2fc8035777e3.slice/crio-a4e5d520c7f0807ae3438d46437f9433ca6adfeec6db06698400b886f8dbbed4 WatchSource:0}: Error finding container a4e5d520c7f0807ae3438d46437f9433ca6adfeec6db06698400b886f8dbbed4: Status 404 returned error can't find the container with id a4e5d520c7f0807ae3438d46437f9433ca6adfeec6db06698400b886f8dbbed4 Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.389620 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.393355 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.396102 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lgg2b" podStartSLOduration=122.396079488 podStartE2EDuration="2m2.396079488s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.377416419 +0000 UTC m=+142.607227787" watchObservedRunningTime="2025-12-05 20:12:52.396079488 +0000 UTC m=+142.625890856" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.412830 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.414403 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:52.914384309 +0000 UTC m=+143.144195677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.434840 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j55sf" podStartSLOduration=121.434823045 podStartE2EDuration="2m1.434823045s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:52.39926127 +0000 UTC m=+142.629072648" watchObservedRunningTime="2025-12-05 20:12:52.434823045 +0000 UTC m=+142.664634403" Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.435236 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.513876 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.514202 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.014190515 +0000 UTC m=+143.244001873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.614593 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.615095 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.115081378 +0000 UTC m=+143.344892746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.709096 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gw9l6"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.711254 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.719256 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.719565 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.219554044 +0000 UTC m=+143.449365412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.730048 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.752249 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9wgfm"] Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.754161 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd0664e_36e7_48d4_bfbe_76cdf69883b6.slice/crio-936b3bb5244d86bdd415cd93df582ba42a279c2f7f1472910f435080270058ec WatchSource:0}: Error finding container 936b3bb5244d86bdd415cd93df582ba42a279c2f7f1472910f435080270058ec: Status 404 returned error can't find the container with id 936b3bb5244d86bdd415cd93df582ba42a279c2f7f1472910f435080270058ec Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.773971 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.792382 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.792417 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.797174 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.802475 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.808415 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l6gl7"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.819955 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.820403 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.320277194 +0000 UTC m=+143.550088562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.827557 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff0752e_d15d_4137_a5a4_ed8c29efbc74.slice/crio-d6b3209386317bd485c311eac9e567da0ad5a888f79315b0be0f5a44588dfbe5 WatchSource:0}: Error finding container d6b3209386317bd485c311eac9e567da0ad5a888f79315b0be0f5a44588dfbe5: Status 404 returned error can't find the container with id d6b3209386317bd485c311eac9e567da0ad5a888f79315b0be0f5a44588dfbe5 Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.827860 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f9a23a_b663_4cbf_8c34_334f073e3092.slice/crio-8610abbc0652d06c8f8a195c1cacf07c292f848d22aa2a687d219f35584d0b21 WatchSource:0}: Error finding container 8610abbc0652d06c8f8a195c1cacf07c292f848d22aa2a687d219f35584d0b21: Status 404 returned error can't find the container with id 8610abbc0652d06c8f8a195c1cacf07c292f848d22aa2a687d219f35584d0b21 Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.848871 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.850552 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.858831 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.863151 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.872335 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9hjlq"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.872367 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.877358 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2wxh"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.877675 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.880597 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d6kbr"] Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.882826 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9"] Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.895907 4744 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55540507_8d49_4b29_8c37_30d340e4eb1b.slice/crio-8d641e0079a49cb30bc0158c9f34f0730a22da59ee57958bca16cc928df3a578 WatchSource:0}: Error finding container 8d641e0079a49cb30bc0158c9f34f0730a22da59ee57958bca16cc928df3a578: Status 404 returned error can't find the container with id 8d641e0079a49cb30bc0158c9f34f0730a22da59ee57958bca16cc928df3a578 Dec 05 20:12:52 crc kubenswrapper[4744]: I1205 20:12:52.920906 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:52 crc kubenswrapper[4744]: E1205 20:12:52.921156 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.421145997 +0000 UTC m=+143.650957355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:52 crc kubenswrapper[4744]: W1205 20:12:52.949504 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361522f8_b0a1_45d2_baa1_9779678fa54f.slice/crio-803f1479439e2f08d8f4d6e188604f22b5708d6641b49efd30eeb0f57ea508b3 WatchSource:0}: Error finding container 803f1479439e2f08d8f4d6e188604f22b5708d6641b49efd30eeb0f57ea508b3: Status 404 returned error can't find the container with id 803f1479439e2f08d8f4d6e188604f22b5708d6641b49efd30eeb0f57ea508b3 Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.030778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.031210 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.531195226 +0000 UTC m=+143.761006594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.129255 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:12:53 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:12:53 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:12:53 crc kubenswrapper[4744]: healthz check failed Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.129323 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.132232 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.132503 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.63249006 +0000 UTC m=+143.862301428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.233130 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.233924 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.733903477 +0000 UTC m=+143.963714845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.336932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.337254 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.837242444 +0000 UTC m=+144.067053812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.347590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerStarted","Data":"f018f5710b57407bb4d7ad9e142431edfdec6bedb229e8163750772064d2514a"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.347630 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerStarted","Data":"6f7d8c55d67041289ad2a2b9f1ed9fbecbbd39f709c188a1f00a099233db6a77"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.388084 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" event={"ID":"f9c687ae-84e1-44ed-801d-abbbff13acd9","Type":"ContainerStarted","Data":"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.388756 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.401963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" event={"ID":"492a9a03-8b00-4fc5-aa95-98a11aa090c7","Type":"ContainerStarted","Data":"c9a2b2fe8880681afb5d92c8024eb35e7c462c1bd781c5be7a69f1efb901e851"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.414647 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" podStartSLOduration=123.414633553 
podStartE2EDuration="2m3.414633553s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.413322699 +0000 UTC m=+143.643134067" watchObservedRunningTime="2025-12-05 20:12:53.414633553 +0000 UTC m=+143.644444921" Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.420177 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2rvfr" Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.429071 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" event={"ID":"15306917-0f1c-4f26-9eda-637d43a32172","Type":"ContainerStarted","Data":"de3b438497eaad5ac2e8cf9ab2a34a744a5507a63c548ca548ba8dfcf17c323f"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.432795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" event={"ID":"09993f0f-6381-4517-8246-ef1d188bea5c","Type":"ContainerStarted","Data":"41bce51eb5531ae3b3c844680d3510bee9730699b23438127da1504dfed727c8"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.432829 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" event={"ID":"09993f0f-6381-4517-8246-ef1d188bea5c","Type":"ContainerStarted","Data":"0ada10cf1dd7877d08d7f0cba4dc881d1171ef5413cbc7d4a6eee50e224efc87"} Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.439626 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.442276 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.442885 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:53.942864709 +0000 UTC m=+144.172676077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.445904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f79s" event={"ID":"52304d1d-cc58-4ce5-91b7-d8a2dec7a5d9","Type":"ContainerStarted","Data":"de9b3399f22b0d719ee67160046226a9c0a434141a1ce14fb49cbcb5d8b8f755"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.446425 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6f79s"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.449199 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" event={"ID":"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365","Type":"ContainerStarted","Data":"0e29b52bed21288675eb6a65d627111977720591791c88111c8404cde24d02bb"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.491691 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" event={"ID":"53f9a23a-b663-4cbf-8c34-334f073e3092","Type":"ContainerStarted","Data":"8610abbc0652d06c8f8a195c1cacf07c292f848d22aa2a687d219f35584d0b21"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.514529 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rjqnz" podStartSLOduration=123.514515242 podStartE2EDuration="2m3.514515242s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.497918055 +0000 UTC m=+143.727729423" watchObservedRunningTime="2025-12-05 20:12:53.514515242 +0000 UTC m=+143.744326610"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.546768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.548558 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.048542966 +0000 UTC m=+144.278354334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.564589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" event={"ID":"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b","Type":"ContainerStarted","Data":"c9ee571e800cc19e5f039720a54c0c497dce853065f239191b32f90f771ca24f"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.564642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" event={"ID":"9aad9ec0-529f-41f0-bbc7-5b16d4346c9b","Type":"ContainerStarted","Data":"756bedf27e08482a752cb86cad532dd299a9b69711125720e0b1b7f57290b292"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.600634 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6f79s" podStartSLOduration=8.600599084 podStartE2EDuration="8.600599084s" podCreationTimestamp="2025-12-05 20:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.515318422 +0000 UTC m=+143.745129790" watchObservedRunningTime="2025-12-05 20:12:53.600599084 +0000 UTC m=+143.830410452"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.602454 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gslcv" podStartSLOduration=122.602448071 podStartE2EDuration="2m2.602448071s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.599066865 +0000 UTC m=+143.828878233" watchObservedRunningTime="2025-12-05 20:12:53.602448071 +0000 UTC m=+143.832259439"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.608769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" event={"ID":"21c94501-58a7-4b02-94aa-2fc8035777e3","Type":"ContainerStarted","Data":"2efcdbd515d95f7250cf561e8602aeed311b205c0c1a25eb9fda3db0b51dd875"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.608815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" event={"ID":"21c94501-58a7-4b02-94aa-2fc8035777e3","Type":"ContainerStarted","Data":"51e124dccc092f9de64ed55500bb3cbd2107abc2c59e044135857da84bf30449"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.608824 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" event={"ID":"21c94501-58a7-4b02-94aa-2fc8035777e3","Type":"ContainerStarted","Data":"a4e5d520c7f0807ae3438d46437f9433ca6adfeec6db06698400b886f8dbbed4"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.623552 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" event={"ID":"5b931ded-d187-4535-b266-0d17996f0b27","Type":"ContainerStarted","Data":"ceac37771e7550558219506da426a781cb44b8a3a6455ee78a5d1d38f92e7c21"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.632369 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerStarted","Data":"0ef306a513dc0ee16ea9f8bda71f5f6eefe8a0b4bc2465c1f7246d03221cc1cf"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.653720 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.654090 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.154076369 +0000 UTC m=+144.383887737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.656038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" event={"ID":"57ee820b-1f44-41e2-b44b-b6bb25edb5af","Type":"ContainerStarted","Data":"826d0ed14afd8a0be5a82f1b663c9d4270c87efe6f3c8bb6275aacad33c38246"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.681356 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" event={"ID":"a099a621-9515-4776-bc62-12fb0fa62340","Type":"ContainerStarted","Data":"610823415b68bb2992cf1585648c8933a490cd303526195ac7698f3f9cfaa493"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.694312 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lgg2b" event={"ID":"55e3c0e4-3a89-48b0-a218-f89546c09a5d","Type":"ContainerStarted","Data":"7105d8472e4363e2fc82c8c4b5bfae502cd8a4cad41e5e729eabbf09a2090d86"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.713879 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" event={"ID":"79d58c0b-affd-462b-b4ee-1134ede8bcb5","Type":"ContainerStarted","Data":"e0749af51f3f0423af51f548da7e1466caef8ba231bbaa48a4fcf988bc783e12"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.721448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" event={"ID":"55540507-8d49-4b29-8c37-30d340e4eb1b","Type":"ContainerStarted","Data":"8d641e0079a49cb30bc0158c9f34f0730a22da59ee57958bca16cc928df3a578"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.723452 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" event={"ID":"c941b3ea-ef53-47c4-b10a-6e949b7098d2","Type":"ContainerStarted","Data":"b07482bfa423f429e1d77c231c940898f9a6108a97df7f0780bcafc6d128a4ed"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.724808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" event={"ID":"c8430c27-e731-481d-8579-06bd5c157f2c","Type":"ContainerStarted","Data":"92858c69f057b0ab60d1847d3fd433fa2db44f3eb547bda9d78b1255d469a36d"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.727777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" event={"ID":"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a","Type":"ContainerStarted","Data":"4709235971cec61b7a24c7d199ab74dbf661943a97323e6476bab3dd4a41d67e"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.727993 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.752448 4744 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qfxk4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.752496 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" podUID="ceb8b187-9126-4d1e-8201-b4d12a0d1e7a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.753511 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" event={"ID":"943916ae-78c3-4ff3-8f1b-71c56ad874dd","Type":"ContainerStarted","Data":"52d6e848b585e18da322a9366f2901834a9ee425e3d1d0bd22277e3787f86425"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.754900 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.755798 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.255786484 +0000 UTC m=+144.485597852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.756890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" event={"ID":"361522f8-b0a1-45d2-baa1-9779678fa54f","Type":"ContainerStarted","Data":"803f1479439e2f08d8f4d6e188604f22b5708d6641b49efd30eeb0f57ea508b3"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.758949 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" podStartSLOduration=122.758923044 podStartE2EDuration="2m2.758923044s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.757150679 +0000 UTC m=+143.986962047" watchObservedRunningTime="2025-12-05 20:12:53.758923044 +0000 UTC m=+143.988734412"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.759320 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ctwg7" podStartSLOduration=122.759315844 podStartE2EDuration="2m2.759315844s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.638530769 +0000 UTC m=+143.868342127" watchObservedRunningTime="2025-12-05 20:12:53.759315844 +0000 UTC m=+143.989127212"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.783754 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" event={"ID":"3b6ef406-5003-4eb6-bf53-3a195fcface8","Type":"ContainerStarted","Data":"dd10c55a4e9f3a0a704285b211898974e5a6a5cdef39b966c06c54b858436873"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.785790 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" event={"ID":"818b6964-1c62-4e2e-8079-a41f9bdcb763","Type":"ContainerStarted","Data":"268e3f05c4bc338cd0e86d727394c7b1aa8b8fb987be7a75e47c3da785315d63"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.786584 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" event={"ID":"59b4fd96-82d8-4cf5-a063-393b6f775e45","Type":"ContainerStarted","Data":"29a47c4ba8b53c3e72ffd67cbc5fe31ad75c51c773dbc410da8332813fac0ea9"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.788992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" event={"ID":"aff0752e-d15d-4137-a5a4-ed8c29efbc74","Type":"ContainerStarted","Data":"d6b3209386317bd485c311eac9e567da0ad5a888f79315b0be0f5a44588dfbe5"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.792253 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" podStartSLOduration=122.792243031 podStartE2EDuration="2m2.792243031s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.790257351 +0000 UTC m=+144.020068719" watchObservedRunningTime="2025-12-05 20:12:53.792243031 +0000 UTC m=+144.022054399"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.803103 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nxjnb"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.808067 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nxjnb" event={"ID":"fc23ad84-d2b5-4f8b-a110-143219eb78a9","Type":"ContainerStarted","Data":"c2fe6899b2cf5b44655d8e9934e80ff423d98fcb949508d435312a762224bf5d"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.813443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" event={"ID":"32fe940e-dd94-4dd9-921c-fcd99ddccb2a","Type":"ContainerStarted","Data":"dd35bcdd2f87c7a8e963889c0affde6073447cbca157aa4d26b333a4064de798"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.820814 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nxjnb"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.824030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" event={"ID":"264cec36-f420-4db9-ba83-266f78ecb82d","Type":"ContainerStarted","Data":"d26913a7e68f48109785520f479bc74ae8672d470bad4dff00c3dc4cd0aeee7c"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.826283 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" event={"ID":"e9f2d8c0-c11b-4910-aa67-5be21f46b32d","Type":"ContainerStarted","Data":"0f4a229ec7d1eaaf2fa5520916bd9b0b019a097239ca34ac986b9827943dfaab"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.829645 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gw9l6" event={"ID":"2dd0664e-36e7-48d4-bfbe-76cdf69883b6","Type":"ContainerStarted","Data":"7c23bcf61eb521e6fff1f5ffd7fadf59f9d1ee7f6f8fab2a78a688168d447790"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.829663 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gw9l6" event={"ID":"2dd0664e-36e7-48d4-bfbe-76cdf69883b6","Type":"ContainerStarted","Data":"936b3bb5244d86bdd415cd93df582ba42a279c2f7f1472910f435080270058ec"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.830032 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gw9l6"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.832177 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" podStartSLOduration=122.832158648 podStartE2EDuration="2m2.832158648s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.82993386 +0000 UTC m=+144.059745238" watchObservedRunningTime="2025-12-05 20:12:53.832158648 +0000 UTC m=+144.061970016"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.841194 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" event={"ID":"2a914cea-d605-479e-9f9c-97fedfeddaf4","Type":"ContainerStarted","Data":"d9c95dedb94519b2830210337f50f9adb8ab8115735864b871d650e9f4d1ed02"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.841232 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" event={"ID":"2a914cea-d605-479e-9f9c-97fedfeddaf4","Type":"ContainerStarted","Data":"44cf2e6dd3ade01d46ef6bfcbeb6ebd52e5329ebed03d4d18dbd4aaac22a3877"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.841243 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" event={"ID":"2a914cea-d605-479e-9f9c-97fedfeddaf4","Type":"ContainerStarted","Data":"9a9ac2b6d3d10da210a7ba1ed13cd136a036bb2fff830736299cffa9b7890338"}
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.841256 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.858408 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-gw9l6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.858482 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gw9l6" podUID="2dd0664e-36e7-48d4-bfbe-76cdf69883b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.866808 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.868596 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.368570214 +0000 UTC m=+144.598381622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.893555 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5krf" podStartSLOduration=122.893538455 podStartE2EDuration="2m2.893538455s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.893189466 +0000 UTC m=+144.123000834" watchObservedRunningTime="2025-12-05 20:12:53.893538455 +0000 UTC m=+144.123349823"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.893829 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nxjnb" podStartSLOduration=123.893824283 podStartE2EDuration="2m3.893824283s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.853443515 +0000 UTC m=+144.083254883" watchObservedRunningTime="2025-12-05 20:12:53.893824283 +0000 UTC m=+144.123635651"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.938560 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gw9l6" podStartSLOduration=123.938540042 podStartE2EDuration="2m3.938540042s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.937457935 +0000 UTC m=+144.167269323" watchObservedRunningTime="2025-12-05 20:12:53.938540042 +0000 UTC m=+144.168351420"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.984994 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:53 crc kubenswrapper[4744]: I1205 20:12:53.985839 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" podStartSLOduration=123.985817607 podStartE2EDuration="2m3.985817607s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:53.979887805 +0000 UTC m=+144.209699183" watchObservedRunningTime="2025-12-05 20:12:53.985817607 +0000 UTC m=+144.215628975"
Dec 05 20:12:53 crc kubenswrapper[4744]: E1205 20:12:53.989905 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.489890742 +0000 UTC m=+144.719702110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.031691 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm" podStartSLOduration=123.031677877 podStartE2EDuration="2m3.031677877s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:54.029560443 +0000 UTC m=+144.259371811" watchObservedRunningTime="2025-12-05 20:12:54.031677877 +0000 UTC m=+144.261489245"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.087797 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.088103 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.588088996 +0000 UTC m=+144.817900364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.108401 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:12:54 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld
Dec 05 20:12:54 crc kubenswrapper[4744]: [+]process-running ok
Dec 05 20:12:54 crc kubenswrapper[4744]: healthz check failed
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.108464 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.190312 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.190955 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.690942561 +0000 UTC m=+144.920753929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.295726 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.296218 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.796202047 +0000 UTC m=+145.026013415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.296801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5x6x6"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.297714 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.305907 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x6x6"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.306729 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.403045 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.403412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.403445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.403513 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849np\" (UniqueName: \"kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.403798 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:54.903787483 +0000 UTC m=+145.133598851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.488078 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.489016 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.506172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.506349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849np\" (UniqueName: \"kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.506401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.506426 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.506795 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.507011 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.507350 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.007315625 +0000 UTC m=+145.237126993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.511665 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.518220 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.542358 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849np\" (UniqueName: \"kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np\") pod \"community-operators-5x6x6\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") " pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.609122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7txn\" (UniqueName: \"kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.609190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.609264 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.609302 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.609641 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.109628395 +0000 UTC m=+145.339439763 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.627352 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.674652 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hpp25"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.675727 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.708969 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpp25"]
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716212 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716386 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716411 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716441 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgv2n\" (UniqueName: \"kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716523 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7txn\" (UniqueName: \"kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716545 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.716629 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.216616045 +0000 UTC m=+145.446427413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.716961 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.717169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.773402 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7txn\" (UniqueName: \"kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn\") pod \"certified-operators-9sq9s\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") " pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.817380 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgv2n\" (UniqueName: \"kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.817446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.817496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.817528 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:54 crc kubenswrapper[4744]: E1205 20:12:54.817785 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.317764115 +0000 UTC m=+145.547575483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.818620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.861323 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:12:54 crc kubenswrapper[4744]: I1205 20:12:54.882341 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" event={"ID":"59b4fd96-82d8-4cf5-a063-393b6f775e45","Type":"ContainerStarted","Data":"cd2721fef8bb4c8c9decca733de2c227811e0d6a9d02f406afde2dd03168efdf"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.884819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" event={"ID":"264cec36-f420-4db9-ba83-266f78ecb82d","Type":"ContainerStarted","Data":"25c313de6b3a623ed38a27355064d85875128657ff6d0697cf4aebd094bb965e"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.886147 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" event={"ID":"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365","Type":"ContainerStarted","Data":"80d6de3f3eb6c67df381c0f35b3eb59b9d02432266cfacdde96b63dde7dfded0"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.887350 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" event={"ID":"e9f2d8c0-c11b-4910-aa67-5be21f46b32d","Type":"ContainerStarted","Data":"76028cea1e36358698a2e07ed6ae13f99993b5d8fdab62df6425426c56c9dc01"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.888694 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" event={"ID":"c8430c27-e731-481d-8579-06bd5c157f2c","Type":"ContainerStarted","Data":"5791e41fc418aea8b623752917188572f193fa554e48c5c412a188b0dfbadad0"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.893325 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" event={"ID":"361522f8-b0a1-45d2-baa1-9779678fa54f","Type":"ContainerStarted","Data":"93cbeaba2335bc30315bdd4d4763c7953c426bf4f75487f7e4b7c9e905a333f8"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.895699 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" event={"ID":"32fe940e-dd94-4dd9-921c-fcd99ddccb2a","Type":"ContainerStarted","Data":"f24d830f74252c3dfc5d33b4f02f5dd9416dfc8542eaeb5c22e9b19bb9a08491"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.895738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" event={"ID":"32fe940e-dd94-4dd9-921c-fcd99ddccb2a","Type":"ContainerStarted","Data":"11cbd2a197c732765269ff28ea5bf716f313823e79bce768d44184fe9fcb4b8e"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.897322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" event={"ID":"a099a621-9515-4776-bc62-12fb0fa62340","Type":"ContainerStarted","Data":"0d1469904e72fd58b38de46b2dd08122fa4a2d6ebebbf22aed0fa542db37afe1"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.908484 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" event={"ID":"79d58c0b-affd-462b-b4ee-1134ede8bcb5","Type":"ContainerStarted","Data":"dd185027bbdfa0d60cf120115466219ff855496768a5fca61ab70b2deface976"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.911227 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" event={"ID":"53f9a23a-b663-4cbf-8c34-334f073e3092","Type":"ContainerStarted","Data":"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.912342 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.933187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerStarted","Data":"5610a9ce5330a185d34e016d49a6bb7b3cd611646e931426f40f771de4dede8a"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.933709 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:54.933978 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.433956242 +0000 UTC m=+145.663767610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.934710 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.935745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:54.937777 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.43775994 +0000 UTC m=+145.667571308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.948155 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerStarted","Data":"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.949328 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.950454 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr74j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.950494 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.971441 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" event={"ID":"5b931ded-d187-4535-b266-0d17996f0b27","Type":"ContainerStarted","Data":"1292b254f2515173f4a79b7d6e95a0c1830dca5de4bdfec0f2a1309cc6f37c1e"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.984038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" event={"ID":"57ee820b-1f44-41e2-b44b-b6bb25edb5af","Type":"ContainerStarted","Data":"1ada87ce21a6ea849da80235c210c067a6bd968d8a56cfe777c08a030c85d664"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.985631 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4" event={"ID":"ceb8b187-9126-4d1e-8201-b4d12a0d1e7a","Type":"ContainerStarted","Data":"9135f514f6ac9d8aad44934671a96b93a82f8e074d59b5f9f0e189750884d982"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.990201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" event={"ID":"15306917-0f1c-4f26-9eda-637d43a32172","Type":"ContainerStarted","Data":"161dbce18e177ba6113459efeaebb5061a45549fb06d866a67a1daa58871acd5"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.990226 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" event={"ID":"15306917-0f1c-4f26-9eda-637d43a32172","Type":"ContainerStarted","Data":"fec03cda867fee904ae701420384cfd559e83db5fdde6099f81839c390a6a65a"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.991281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8h6p" event={"ID":"55540507-8d49-4b29-8c37-30d340e4eb1b","Type":"ContainerStarted","Data":"21a8241106bb45d03aae6b1a448506b7ac34e0e543ceb47219bf041991784a94"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.992906 4744 generic.go:334] "Generic (PLEG): container finished" podID="aff0752e-d15d-4137-a5a4-ed8c29efbc74" containerID="46397e256a0842e77aec0c8c83a37db420643954bb730c2c1032db14d08d2b18" exitCode=0
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.992943 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" event={"ID":"aff0752e-d15d-4137-a5a4-ed8c29efbc74","Type":"ContainerStarted","Data":"1c89065e14cd2d821233992f73251280b1fee671224386f645e27d4d724d23b4"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.992955 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" event={"ID":"aff0752e-d15d-4137-a5a4-ed8c29efbc74","Type":"ContainerDied","Data":"46397e256a0842e77aec0c8c83a37db420643954bb730c2c1032db14d08d2b18"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.993332 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.994340 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" event={"ID":"943916ae-78c3-4ff3-8f1b-71c56ad874dd","Type":"ContainerStarted","Data":"6d1c2095fdf30dd64ca7e96585085fa7569b68c90f95c444815d53631ec8e69c"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.995777 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" event={"ID":"3b6ef406-5003-4eb6-bf53-3a195fcface8","Type":"ContainerStarted","Data":"2796d32432d19d2a3f5c7602959d06f7e28d4220b6ba831cc36fc524c95e088b"}
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.997116 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-gw9l6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:54.997162 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gw9l6" podUID="2dd0664e-36e7-48d4-bfbe-76cdf69883b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.033257 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.033637 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgv2n\" (UniqueName: \"kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n\") pod \"community-operators-hpp25\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.036844 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.037891 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.537868624 +0000 UTC m=+145.767680072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.038598 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qfxk4"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.052034 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpp25"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.058831 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"]
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.060127 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2ptt"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.085686 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l6gl7" podStartSLOduration=124.085669473 podStartE2EDuration="2m4.085669473s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.063229116 +0000 UTC m=+145.293040484" watchObservedRunningTime="2025-12-05 20:12:55.085669473 +0000 UTC m=+145.315480841"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.090853 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"]
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.107773 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:12:55 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld
Dec 05 20:12:55 crc kubenswrapper[4744]: [+]process-running ok
Dec 05 20:12:55 crc kubenswrapper[4744]: healthz check failed
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.107825 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.139531 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.148565 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.64855252 +0000 UTC m=+145.878363888 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.172732 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" podStartSLOduration=125.172714061 podStartE2EDuration="2m5.172714061s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.14273764 +0000 UTC m=+145.372549008" watchObservedRunningTime="2025-12-05 20:12:55.172714061 +0000 UTC m=+145.402525419" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.219625 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-48knz" podStartSLOduration=124.219609556 podStartE2EDuration="2m4.219609556s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.171470859 +0000 UTC m=+145.401282227" watchObservedRunningTime="2025-12-05 20:12:55.219609556 +0000 UTC m=+145.449420924" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.219996 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n7t68" podStartSLOduration=125.219992036 podStartE2EDuration="2m5.219992036s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.21820042 +0000 UTC m=+145.448011788" watchObservedRunningTime="2025-12-05 20:12:55.219992036 +0000 UTC m=+145.449803404" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.240871 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.241042 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.241119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.241168 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clxq\" (UniqueName: \"kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.241263 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.741248983 +0000 UTC m=+145.971060351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.338752 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" podStartSLOduration=124.338734739 podStartE2EDuration="2m4.338734739s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.335117916 +0000 UTC m=+145.564929284" watchObservedRunningTime="2025-12-05 20:12:55.338734739 +0000 UTC m=+145.568546107" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.343235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.343323 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.343345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.343387 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clxq\" (UniqueName: \"kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.344052 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.344278 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.844266071 +0000 UTC m=+146.074077439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.344678 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.389586 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6wx2p" podStartSLOduration=125.389574976 podStartE2EDuration="2m5.389574976s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.387166554 +0000 UTC m=+145.616977922" watchObservedRunningTime="2025-12-05 20:12:55.389574976 +0000 UTC m=+145.619386344" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.389717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clxq\" (UniqueName: \"kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq\") pod \"certified-operators-s2ptt\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.408938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x6x6"] Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.430097 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.430170 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jc4pg" podStartSLOduration=124.430152479 podStartE2EDuration="2m4.430152479s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.4297678 +0000 UTC m=+145.659579168" watchObservedRunningTime="2025-12-05 20:12:55.430152479 +0000 UTC m=+145.659963847" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.450226 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.450707 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:55.950692817 +0000 UTC m=+146.180504185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.503621 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llkqs" podStartSLOduration=124.503606747 podStartE2EDuration="2m4.503606747s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.474052157 +0000 UTC m=+145.703863525" watchObservedRunningTime="2025-12-05 20:12:55.503606747 +0000 UTC m=+145.733418115" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.504953 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podStartSLOduration=124.504947412 podStartE2EDuration="2m4.504947412s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.502862829 +0000 UTC m=+145.732674197" watchObservedRunningTime="2025-12-05 20:12:55.504947412 +0000 UTC m=+145.734758780" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.555705 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.556042 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.056029625 +0000 UTC m=+146.285840993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.557821 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" podStartSLOduration=125.557801031 podStartE2EDuration="2m5.557801031s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.529413271 +0000 UTC m=+145.759224639" watchObservedRunningTime="2025-12-05 20:12:55.557801031 +0000 UTC m=+145.787612399" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.625408 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cz77l" podStartSLOduration=124.625387908 podStartE2EDuration="2m4.625387908s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.559979667 +0000 UTC m=+145.789791035" watchObservedRunningTime="2025-12-05 20:12:55.625387908 +0000 UTC m=+145.855199276" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.664855 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.665029 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.165004157 +0000 UTC m=+146.394815525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.665154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.665554 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.165544531 +0000 UTC m=+146.395355899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.672919 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" podStartSLOduration=125.67290662 podStartE2EDuration="2m5.67290662s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.625971463 +0000 UTC m=+145.855782851" watchObservedRunningTime="2025-12-05 20:12:55.67290662 +0000 UTC m=+145.902717988" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.736165 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9hjlq" podStartSLOduration=124.736147525 podStartE2EDuration="2m4.736147525s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:55.718859031 +0000 UTC m=+145.948670399" watchObservedRunningTime="2025-12-05 20:12:55.736147525 +0000 UTC m=+145.965958893" Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.736936 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"] Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.768961 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.769338 4744 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.269323808 +0000 UTC m=+146.499135176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.870860 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.871207 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.371195538 +0000 UTC m=+146.601006906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.971804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.972207 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.472183434 +0000 UTC m=+146.701994802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:55 crc kubenswrapper[4744]: I1205 20:12:55.972383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:55 crc kubenswrapper[4744]: E1205 20:12:55.972678 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.472670746 +0000 UTC m=+146.702482114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.041111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" event={"ID":"c8430c27-e731-481d-8579-06bd5c157f2c","Type":"ContainerStarted","Data":"302393044a26a9a0c8a8e74e1b947d758c3626f21947bc0b60a4251d21b49d67"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.075249 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.075624 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.575607143 +0000 UTC m=+146.805418511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.075933 4744 generic.go:334] "Generic (PLEG): container finished" podID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerID="74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c" exitCode=0 Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.076003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerDied","Data":"74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.076026 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerStarted","Data":"3305a533fbc19e835c6f9084ca683a4544a5aa935d68f2ec7d230c76caf1a32a"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.077905 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.078261 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"] Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.079143 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-d6kbr" podStartSLOduration=125.079130013 podStartE2EDuration="2m5.079130013s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:56.074997416 +0000 UTC m=+146.304808784" watchObservedRunningTime="2025-12-05 20:12:56.079130013 +0000 UTC m=+146.308941381" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.107251 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" event={"ID":"3f9ec60a-e0c3-4a0c-8b43-809eb09fb365","Type":"ContainerStarted","Data":"21f5d8eb70b6f91badce45268256d53025183628d26f7e5b895c5c791f1c3cc6"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.108082 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:12:56 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:12:56 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:12:56 crc kubenswrapper[4744]: healthz check failed Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.108112 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.108908 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerStarted","Data":"c6cd24b6d9af028602bd7a8afb08e21d47a9c986881379099cdf99834623c6b2"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.111553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" event={"ID":"e9f2d8c0-c11b-4910-aa67-5be21f46b32d","Type":"ContainerStarted","Data":"0f1d0de56315ec5b986e59a9aafe826cb794f0867b35afb52b77c9c349b106a1"} Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.112249 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr74j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.112277 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.179370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.181092 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.681080894 +0000 UTC m=+146.910892262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.214623 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xtrg9" podStartSLOduration=125.214608826 podStartE2EDuration="2m5.214608826s" podCreationTimestamp="2025-12-05 20:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:56.18910486 +0000 UTC m=+146.418916228" watchObservedRunningTime="2025-12-05 20:12:56.214608826 +0000 UTC m=+146.444420194" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.215600 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g2wxh" podStartSLOduration=126.215596452 podStartE2EDuration="2m6.215596452s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:56.211274911 +0000 UTC m=+146.441086279" watchObservedRunningTime="2025-12-05 20:12:56.215596452 +0000 UTC m=+146.445407820" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.217961 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpp25"] Dec 05 20:12:56 crc kubenswrapper[4744]: W1205 20:12:56.281479 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06f2a4f_7424_477d_b47a_9d71ce3cdd21.slice/crio-9bbb64196873ddc3251234dbe1fa096bd318bf80902ec6d9bd5e5b426d0c8ef2 WatchSource:0}: Error finding container 9bbb64196873ddc3251234dbe1fa096bd318bf80902ec6d9bd5e5b426d0c8ef2: Status 404 returned error can't find the container with id 9bbb64196873ddc3251234dbe1fa096bd318bf80902ec6d9bd5e5b426d0c8ef2 Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.281751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.283111 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.783096367 +0000 UTC m=+147.012907735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.383584 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.384191 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.884179485 +0000 UTC m=+147.113990843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.481206 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"] Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.482097 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.485505 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.485940 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:56.985922531 +0000 UTC m=+147.215733899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.486173 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.493132 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"] Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.587838 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvtx\" (UniqueName: \"kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.587899 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.587930 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.587982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.588012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.588045 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.588322 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.088307563 +0000 UTC m=+147.318118931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.593326 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.612163 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.673369 4744 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.688655 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.688804 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.188781976 +0000 UTC m=+147.418593334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.688842 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.688876 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvtx\" (UniqueName: \"kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.688948 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.688974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.689339 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.18932204 +0000 UTC m=+147.419133408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.689367 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.689396 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.700757 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.712347 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvtx\" (UniqueName: \"kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx\") pod \"redhat-marketplace-tclr2\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") " pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.760833 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.761447 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.764420 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.764654 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.791827 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.792585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.792989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.793029 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.794673 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.294629507 +0000 UTC m=+147.524440875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.798782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.807191 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.874603 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tclr2"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.875421 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"]
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.876354 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.894027 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.894092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.894115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.894439 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.394427543 +0000 UTC m=+147.624238911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.951213 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"]
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999664 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999684 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999711 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:56 crc kubenswrapper[4744]: I1205 20:12:56.999770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmf9c\" (UniqueName: \"kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:56 crc kubenswrapper[4744]: E1205 20:12:56.999855 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.499841063 +0000 UTC m=+147.729652431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.000357 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.015499 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.039465 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.043097 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.101105 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.101399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmf9c\" (UniqueName: \"kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.101424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.101466 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.101932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: E1205 20:12:57.102150 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.602139153 +0000 UTC m=+147.831950521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-628ml" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.102704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.102834 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 20:12:57 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld
Dec 05 20:12:57 crc kubenswrapper[4744]: [+]process-running ok
Dec 05 20:12:57 crc kubenswrapper[4744]: healthz check failed
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.102852 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.122980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmf9c\" (UniqueName: \"kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c\") pod \"redhat-marketplace-2n2x7\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.130166 4744 generic.go:334] "Generic (PLEG): container finished" podID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerID="e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0" exitCode=0
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.130353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerDied","Data":"e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.141439 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerID="6fba76e79ae3216a2b9fcd333c0b4659e7101bfe672c8f79e6e14149bb0d7118" exitCode=0
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.141528 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerDied","Data":"6fba76e79ae3216a2b9fcd333c0b4659e7101bfe672c8f79e6e14149bb0d7118"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.141554 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerStarted","Data":"9bbb64196873ddc3251234dbe1fa096bd318bf80902ec6d9bd5e5b426d0c8ef2"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.145763 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.147184 4744 generic.go:334] "Generic (PLEG): container finished" podID="7b0787d1-231e-453f-8f0a-09804298f1db" containerID="8329f6aba3faec10162b6e9ada8e24903a80b16acbe7950c770240e8fbfac565" exitCode=0
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.147238 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerDied","Data":"8329f6aba3faec10162b6e9ada8e24903a80b16acbe7950c770240e8fbfac565"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.147261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerStarted","Data":"57bf9e20bc3bcf918fc084984e9e5a570e87431ae50fac516dc81976976c6153"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.157427 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" event={"ID":"3b6ef406-5003-4eb6-bf53-3a195fcface8","Type":"ContainerStarted","Data":"acce63a443ef74ce53aed1b562007980a53c1299aabb734a182ecc94547b4e84"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.157457 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" event={"ID":"3b6ef406-5003-4eb6-bf53-3a195fcface8","Type":"ContainerStarted","Data":"16baa69d49a5b9be77eea8b12c44fcda539ab49004fce655867f8298189f9592"}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.166614 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.219573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 20:12:57 crc kubenswrapper[4744]: E1205 20:12:57.221418 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 20:12:57.721397939 +0000 UTC m=+147.951209307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.224039 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2n2x7"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.240698 4744 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T20:12:56.673403221Z","Handler":null,"Name":""}
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.255960 4744 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.255991 4744 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.323065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.340349 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.340389 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.365921 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.366325 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.386021 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-628ml\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") " pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.392671 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.425597 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.430920 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.465716 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.469682 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.470623 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.474624 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.484805 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.484845 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.490370 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.502589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhpng" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.511884 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:57 crc kubenswrapper[4744]: W1205 20:12:57.589856 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-21897c49a3c20132afbe519d2598c38f1dcb09a48835621fe63a03c67642d4ad WatchSource:0}: Error finding container 21897c49a3c20132afbe519d2598c38f1dcb09a48835621fe63a03c67642d4ad: Status 404 returned error can't find the container with id 21897c49a3c20132afbe519d2598c38f1dcb09a48835621fe63a03c67642d4ad Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.629107 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.629161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.629189 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqpw\" (UniqueName: \"kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.646464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.673590 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.677562 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 
20:12:57.678730 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: W1205 20:12:57.690855 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07b8700_0120_4aa2_bd07_8a6f06d8dbf8.slice/crio-a8e58c6a9239fb2131c05a950afcdc44b13b62007adb69fbb842e1cdd00360ae WatchSource:0}: Error finding container a8e58c6a9239fb2131c05a950afcdc44b13b62007adb69fbb842e1cdd00360ae: Status 404 returned error can't find the container with id a8e58c6a9239fb2131c05a950afcdc44b13b62007adb69fbb842e1cdd00360ae Dec 05 20:12:57 crc kubenswrapper[4744]: W1205 20:12:57.702059 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod47a7dbf8_2b54_40e1_9792_fece61e52edf.slice/crio-a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f WatchSource:0}: Error finding container a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f: Status 404 returned error can't find the container with id a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.725501 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.743250 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.743316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.743339 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqpw\" (UniqueName: \"kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.744048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.744326 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.746684 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.797100 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lqqpw\" (UniqueName: \"kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw\") pod \"redhat-operators-dd7w8\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") " pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.799846 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-gw9l6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.799881 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gw9l6" podUID="2dd0664e-36e7-48d4-bfbe-76cdf69883b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.799976 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-gw9l6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.800020 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gw9l6" podUID="2dd0664e-36e7-48d4-bfbe-76cdf69883b6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.808543 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"] Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.847012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.847150 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.847230 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8v4\" (UniqueName: \"kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: W1205 20:12:57.896758 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-409dda82538da9577ccff1704ddd86103dcf4cf2893486218f5e642b94b3fd07 WatchSource:0}: Error finding container 409dda82538da9577ccff1704ddd86103dcf4cf2893486218f5e642b94b3fd07: Status 404 returned error can't 
find the container with id 409dda82538da9577ccff1704ddd86103dcf4cf2893486218f5e642b94b3fd07 Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.948884 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.949332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.949385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8v4\" (UniqueName: \"kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.949412 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.949639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:57 crc kubenswrapper[4744]: I1205 20:12:57.970151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8v4\" (UniqueName: \"kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4\") pod \"redhat-operators-7fvql\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.062570 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.091616 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.101422 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:12:58 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:12:58 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:12:58 crc kubenswrapper[4744]: healthz check failed Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.101468 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.111486 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.112047 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.212529 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"278d2a97ee410631d363552ceec4ebae3f9287411127cfcf681139caa35289ec"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.212787 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"38901581915c5094e3e62020f65bf4fe274f5fa869edff718e2b66991110d3ad"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.240552 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47a7dbf8-2b54-40e1-9792-fece61e52edf","Type":"ContainerStarted","Data":"a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.278266 4744 generic.go:334] "Generic (PLEG): container finished" podID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerID="d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf" exitCode=0 Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.278586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerDied","Data":"d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.278625 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerStarted","Data":"a8e58c6a9239fb2131c05a950afcdc44b13b62007adb69fbb842e1cdd00360ae"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.284612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" 
event={"ID":"98e5f65e-632c-4932-83cc-413ea5cac23a","Type":"ContainerStarted","Data":"bf768afeefae123d6258722c487cd5f64a0e76c00ad25d9ced41b23a3071cb8a"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.284642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" event={"ID":"98e5f65e-632c-4932-83cc-413ea5cac23a","Type":"ContainerStarted","Data":"340cb9599d2a21e3c856eac949c4ac0763252c4ec8a3dbf1588903c194cdda26"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.285256 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.298696 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.298814 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.302456 4744 patch_prober.go:28] interesting pod/console-f9d7485db-lgg2b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.302509 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lgg2b" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.323276 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" event={"ID":"3b6ef406-5003-4eb6-bf53-3a195fcface8","Type":"ContainerStarted","Data":"67788bbce6fd494f5c800498895d02471ed8957d501eba6a6efa4d9dad277e7f"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.326483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8097029cc03becdc4ddf312fe6896ee19d9dd39776589763bb302b1a826ca7d4"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.326520 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"21897c49a3c20132afbe519d2598c38f1dcb09a48835621fe63a03c67642d4ad"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.326926 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.334212 4744 generic.go:334] "Generic (PLEG): container finished" podID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerID="37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e" exitCode=0 Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.334281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerDied","Data":"37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 
20:12:58.334369 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerStarted","Data":"e024676422bf2c0764ac1923c6ce7ab9d8dd57030dd290474a75fa5789228a49"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.337604 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" podStartSLOduration=128.337584994 podStartE2EDuration="2m8.337584994s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:58.336211148 +0000 UTC m=+148.566022516" watchObservedRunningTime="2025-12-05 20:12:58.337584994 +0000 UTC m=+148.567396372" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.344791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f8c4b2701973970551e7b36a7b93fee8e202949387b38c5756184ce08577f614"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.344841 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"409dda82538da9577ccff1704ddd86103dcf4cf2893486218f5e642b94b3fd07"} Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.352641 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-htzxr" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.353985 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtnpd" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.394954 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.405456 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9wgfm" podStartSLOduration=13.405435998 podStartE2EDuration="13.405435998s" podCreationTimestamp="2025-12-05 20:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:12:58.404840833 +0000 UTC m=+148.634652201" watchObservedRunningTime="2025-12-05 20:12:58.405435998 +0000 UTC m=+148.635247366" Dec 05 20:12:58 crc kubenswrapper[4744]: I1205 20:12:58.648728 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"] Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.042889 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.100740 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:12:59 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:12:59 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:12:59 crc kubenswrapper[4744]: healthz check 
failed Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.100804 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.361660 4744 generic.go:334] "Generic (PLEG): container finished" podID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerID="f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2" exitCode=0 Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.361730 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerDied","Data":"f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2"} Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.361756 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerStarted","Data":"8e1d333ec24ef0f19ad98a76a30bb0176804b1cf36e3a133990ddcb25c320b12"} Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.371533 4744 generic.go:334] "Generic (PLEG): container finished" podID="264cec36-f420-4db9-ba83-266f78ecb82d" containerID="25c313de6b3a623ed38a27355064d85875128657ff6d0697cf4aebd094bb965e" exitCode=0 Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.371652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" event={"ID":"264cec36-f420-4db9-ba83-266f78ecb82d","Type":"ContainerDied","Data":"25c313de6b3a623ed38a27355064d85875128657ff6d0697cf4aebd094bb965e"} Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.378101 4744 generic.go:334] "Generic (PLEG): container finished" podID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerID="9b77022da73ab10f9c3be2e48daf1f31928f40d36db4ef4defc311f4f2296f49" exitCode=0 Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.378150 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerDied","Data":"9b77022da73ab10f9c3be2e48daf1f31928f40d36db4ef4defc311f4f2296f49"} Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.378252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerStarted","Data":"769b3fec5fad69db94dd21a88a1ee77cb16728ed701166920dacbe7e37ddaf36"} Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.384340 4744 generic.go:334] "Generic (PLEG): container finished" podID="47a7dbf8-2b54-40e1-9792-fece61e52edf" containerID="9603b55ef27ce84ef5f2eaf586a0262972a050e15f080a3f25b9e6da33ae118b" exitCode=0 Dec 05 20:12:59 crc kubenswrapper[4744]: I1205 20:12:59.385515 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47a7dbf8-2b54-40e1-9792-fece61e52edf","Type":"ContainerDied","Data":"9603b55ef27ce84ef5f2eaf586a0262972a050e15f080a3f25b9e6da33ae118b"} Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.101463 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 05 20:13:00 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:13:00 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:13:00 crc kubenswrapper[4744]: healthz check failed Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.101524 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.759107 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.774876 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.848597 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir\") pod \"47a7dbf8-2b54-40e1-9792-fece61e52edf\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.848664 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access\") pod \"47a7dbf8-2b54-40e1-9792-fece61e52edf\" (UID: \"47a7dbf8-2b54-40e1-9792-fece61e52edf\") " Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.848783 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume\") pod \"264cec36-f420-4db9-ba83-266f78ecb82d\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.848814 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume\") pod \"264cec36-f420-4db9-ba83-266f78ecb82d\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.848850 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74w7\" (UniqueName: \"kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7\") pod \"264cec36-f420-4db9-ba83-266f78ecb82d\" (UID: \"264cec36-f420-4db9-ba83-266f78ecb82d\") " Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.850050 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47a7dbf8-2b54-40e1-9792-fece61e52edf" (UID: "47a7dbf8-2b54-40e1-9792-fece61e52edf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.850541 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume" (OuterVolumeSpecName: "config-volume") pod "264cec36-f420-4db9-ba83-266f78ecb82d" (UID: "264cec36-f420-4db9-ba83-266f78ecb82d"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.855912 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7" (OuterVolumeSpecName: "kube-api-access-r74w7") pod "264cec36-f420-4db9-ba83-266f78ecb82d" (UID: "264cec36-f420-4db9-ba83-266f78ecb82d"). InnerVolumeSpecName "kube-api-access-r74w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.856463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47a7dbf8-2b54-40e1-9792-fece61e52edf" (UID: "47a7dbf8-2b54-40e1-9792-fece61e52edf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.857989 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "264cec36-f420-4db9-ba83-266f78ecb82d" (UID: "264cec36-f420-4db9-ba83-266f78ecb82d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.950702 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/264cec36-f420-4db9-ba83-266f78ecb82d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.950927 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/264cec36-f420-4db9-ba83-266f78ecb82d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.950944 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74w7\" (UniqueName: \"kubernetes.io/projected/264cec36-f420-4db9-ba83-266f78ecb82d-kube-api-access-r74w7\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.950954 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47a7dbf8-2b54-40e1-9792-fece61e52edf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:00 crc kubenswrapper[4744]: I1205 20:13:00.950961 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47a7dbf8-2b54-40e1-9792-fece61e52edf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.101725 4744 patch_prober.go:28] interesting pod/router-default-5444994796-lk6bf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 20:13:01 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Dec 05 20:13:01 crc kubenswrapper[4744]: [+]process-running ok Dec 05 20:13:01 crc kubenswrapper[4744]: healthz check failed Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.101776 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lk6bf" podUID="aea16266-db6e-4bd6-aac2-8dea60e44c25" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.445763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" event={"ID":"264cec36-f420-4db9-ba83-266f78ecb82d","Type":"ContainerDied","Data":"d26913a7e68f48109785520f479bc74ae8672d470bad4dff00c3dc4cd0aeee7c"} Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.445804 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d26913a7e68f48109785520f479bc74ae8672d470bad4dff00c3dc4cd0aeee7c" Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.445900 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c" Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.478112 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47a7dbf8-2b54-40e1-9792-fece61e52edf","Type":"ContainerDied","Data":"a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f"} Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.478154 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a578b25a78d5a533fe3c50a3b0d110709446b05c273ec4863c64f822b79bdd3f" Dec 05 20:13:01 crc kubenswrapper[4744]: I1205 20:13:01.478223 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 20:13:02 crc kubenswrapper[4744]: I1205 20:13:02.111771 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:13:02 crc kubenswrapper[4744]: I1205 20:13:02.114695 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lk6bf" Dec 05 20:13:02 crc kubenswrapper[4744]: I1205 20:13:02.842180 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6f79s" Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.597636 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 20:13:03 crc kubenswrapper[4744]: E1205 20:13:03.597889 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a7dbf8-2b54-40e1-9792-fece61e52edf" containerName="pruner" Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.597900 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a7dbf8-2b54-40e1-9792-fece61e52edf" containerName="pruner" Dec 05 20:13:03 crc kubenswrapper[4744]: E1205 20:13:03.597918 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264cec36-f420-4db9-ba83-266f78ecb82d" containerName="collect-profiles" Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.597924 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="264cec36-f420-4db9-ba83-266f78ecb82d" containerName="collect-profiles" Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.598022 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a7dbf8-2b54-40e1-9792-fece61e52edf" containerName="pruner" Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.598035 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="264cec36-f420-4db9-ba83-266f78ecb82d" containerName="collect-profiles" Dec 05 20:13:03 crc kubenswrapper[4744]: 
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.598426 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.600446 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.601120 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.601739 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.751895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.752015 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.854127 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.854409 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.854539 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.874115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:03 crc kubenswrapper[4744]: I1205 20:13:03.959363 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.307057 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.510197 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cee6abd8-d267-48c1-a79c-a90ce5950fc8","Type":"ContainerStarted","Data":"a23f5f2d9706d9f53061c4f8b4971300da11a88ecdae065bc5f0cd856692802c"}
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.516313 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/0.log"
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.516605 4744 generic.go:334] "Generic (PLEG): container finished" podID="476c0833-0a8f-4824-a7fe-6f28aada483b" containerID="f018f5710b57407bb4d7ad9e142431edfdec6bedb229e8163750772064d2514a" exitCode=2
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.516628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerDied","Data":"f018f5710b57407bb4d7ad9e142431edfdec6bedb229e8163750772064d2514a"}
Dec 05 20:13:04 crc kubenswrapper[4744]: I1205 20:13:04.517106 4744 scope.go:117] "RemoveContainer" containerID="f018f5710b57407bb4d7ad9e142431edfdec6bedb229e8163750772064d2514a"
Dec 05 20:13:05 crc kubenswrapper[4744]: I1205 20:13:05.533590 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/0.log"
Dec 05 20:13:05 crc kubenswrapper[4744]: I1205 20:13:05.533786 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerStarted","Data":"c0ea4b844ce57a13528a25bf2fec4140211abd1ff218fc695240b91052ca1791"}
Dec 05 20:13:05 crc kubenswrapper[4744]: I1205 20:13:05.536729 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cee6abd8-d267-48c1-a79c-a90ce5950fc8","Type":"ContainerStarted","Data":"d8e046fcf1b30ada067a8efa65c68470dfe4ca486ebd2420ae3af897d8a465af"}
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.546386 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/1.log"
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.548144 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/0.log"
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.548196 4744 generic.go:334] "Generic (PLEG): container finished" podID="476c0833-0a8f-4824-a7fe-6f28aada483b" containerID="c0ea4b844ce57a13528a25bf2fec4140211abd1ff218fc695240b91052ca1791" exitCode=2
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.548258 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerDied","Data":"c0ea4b844ce57a13528a25bf2fec4140211abd1ff218fc695240b91052ca1791"}
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.548312 4744 scope.go:117] "RemoveContainer" containerID="f018f5710b57407bb4d7ad9e142431edfdec6bedb229e8163750772064d2514a"
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.548858 4744 scope.go:117] "RemoveContainer" containerID="c0ea4b844ce57a13528a25bf2fec4140211abd1ff218fc695240b91052ca1791"
Dec 05 20:13:06 crc kubenswrapper[4744]: E1205 20:13:06.549254 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-samples-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-samples-operator pod=cluster-samples-operator-665b6dd947-h777m_openshift-cluster-samples-operator(476c0833-0a8f-4824-a7fe-6f28aada483b)\"" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" podUID="476c0833-0a8f-4824-a7fe-6f28aada483b"
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.569916 4744 generic.go:334] "Generic (PLEG): container finished" podID="cee6abd8-d267-48c1-a79c-a90ce5950fc8" containerID="d8e046fcf1b30ada067a8efa65c68470dfe4ca486ebd2420ae3af897d8a465af" exitCode=0
Dec 05 20:13:06 crc kubenswrapper[4744]: I1205 20:13:06.569967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cee6abd8-d267-48c1-a79c-a90ce5950fc8","Type":"ContainerDied","Data":"d8e046fcf1b30ada067a8efa65c68470dfe4ca486ebd2420ae3af897d8a465af"}
Dec 05 20:13:07 crc kubenswrapper[4744]: I1205 20:13:07.806123 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gw9l6"
Dec 05 20:13:08 crc kubenswrapper[4744]: I1205 20:13:08.306555 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lgg2b"
Dec 05 20:13:08 crc kubenswrapper[4744]: I1205 20:13:08.310064 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lgg2b"
Dec 05 20:13:12 crc kubenswrapper[4744]: I1205 20:13:12.872626 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:13:12 crc kubenswrapper[4744]: I1205 20:13:12.884040 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d0c84c8-b581-47ce-8cb8-956d3ef79238-metrics-certs\") pod \"network-metrics-daemon-cgjbb\" (UID: \"9d0c84c8-b581-47ce-8cb8-956d3ef79238\") " pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:13:12 crc kubenswrapper[4744]: I1205 20:13:12.925869 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cgjbb"
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.012731 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.089375 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access\") pod \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") "
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.089472 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir\") pod \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\" (UID: \"cee6abd8-d267-48c1-a79c-a90ce5950fc8\") "
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.089664 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cee6abd8-d267-48c1-a79c-a90ce5950fc8" (UID: "cee6abd8-d267-48c1-a79c-a90ce5950fc8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.089991 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.094653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cee6abd8-d267-48c1-a79c-a90ce5950fc8" (UID: "cee6abd8-d267-48c1-a79c-a90ce5950fc8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.191484 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cee6abd8-d267-48c1-a79c-a90ce5950fc8-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.646279 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cee6abd8-d267-48c1-a79c-a90ce5950fc8","Type":"ContainerDied","Data":"a23f5f2d9706d9f53061c4f8b4971300da11a88ecdae065bc5f0cd856692802c"}
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.646333 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23f5f2d9706d9f53061c4f8b4971300da11a88ecdae065bc5f0cd856692802c"
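Every kubenswrapper message above follows the klog header layout Lmmdd hh:mm:ss.uuuuuu PID file:line] msg, with structured key="value" pairs after the closing bracket. A small Go sketch that splits that header, which is handy when tracing sequences like the mount/unmount cycle above; the regex assumes the default klog format rather than being taken from kubelet itself:

package main

import (
	"fmt"
	"regexp"
)

// severity (I/W/E), mmdd date, time with microseconds, PID, source file:line, message
var klogLine = regexp.MustCompile(`^([IWE])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	sample := `I1205 20:13:14.646333 4744 pod_container_deletor.go:80] "Container not found in pod's containers"`
	m := klogLine.FindStringSubmatch(sample)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s msg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}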
Dec 05 20:13:14 crc kubenswrapper[4744]: I1205 20:13:14.646361 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 20:13:17 crc kubenswrapper[4744]: I1205 20:13:17.476269 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:13:19 crc kubenswrapper[4744]: I1205 20:13:19.807713 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:13:19 crc kubenswrapper[4744]: I1205 20:13:19.807798 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:13:20 crc kubenswrapper[4744]: I1205 20:13:20.083480 4744 scope.go:117] "RemoveContainer" containerID="c0ea4b844ce57a13528a25bf2fec4140211abd1ff218fc695240b91052ca1791"
Dec 05 20:13:27 crc kubenswrapper[4744]: I1205 20:13:27.102366 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 20:13:28 crc kubenswrapper[4744]: I1205 20:13:28.029329 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8z2gm"
Dec 05 20:13:35 crc kubenswrapper[4744]: E1205 20:13:35.292702 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 05 20:13:35 crc kubenswrapper[4744]: E1205 20:13:35.293943 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqqpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dd7w8_openshift-marketplace(649cba80-0f59-449e-8a48-fbb1b4d373e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:35 crc kubenswrapper[4744]: E1205 20:13:35.296271 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dd7w8" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.191245 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 05 20:13:36 crc kubenswrapper[4744]: E1205 20:13:36.191540 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee6abd8-d267-48c1-a79c-a90ce5950fc8" containerName="pruner"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.192560 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee6abd8-d267-48c1-a79c-a90ce5950fc8" containerName="pruner"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.192852 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee6abd8-d267-48c1-a79c-a90ce5950fc8" containerName="pruner"
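The ErrImagePull above is the first half of kubelet's retry cycle for the marketplace catalog pods; the ImagePullBackOff entries that follow show the growing delay before each new attempt. A sketch of that exponential back-off pattern in Go; the 10s base and 5m cap follow kubelet's documented defaults (compare the "back-off 10s" CrashLoopBackOff message earlier), and the pull itself is a stub:

package main

import (
	"errors"
	"fmt"
	"time"
)

// stand-in for the CRI PullImage call that keeps failing in the log
func pullImage(image string) error {
	return errors.New("copying system image from manifest list: copying config: context canceled")
}

func main() {
	const image = "registry.redhat.io/redhat/redhat-operator-index:v4.18"
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for attempt := 1; attempt <= 5; attempt++ { // bounded so the sketch terminates
		if err := pullImage(image); err != nil {
			fmt.Printf("attempt %d: ErrImagePull: %v\n", attempt, err)
			fmt.Printf("Back-off %s pulling image %q\n", delay, image)
			time.Sleep(delay) // kubelet tracks the deadline per image instead of sleeping
			if delay *= 2; delay > maxDelay {
				delay = maxDelay
			}
			continue
		}
		fmt.Println("pulled", image)
		return
	}
}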
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.193518 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.198966 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.207947 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.210353 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.309972 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.310125 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.411627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.411686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.411782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.452428 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: I1205 20:13:36.526604 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 05 20:13:36 crc kubenswrapper[4744]: E1205 20:13:36.974272 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dd7w8" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3"
Dec 05 20:13:37 crc kubenswrapper[4744]: E1205 20:13:37.072831 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 05 20:13:37 crc kubenswrapper[4744]: E1205 20:13:37.073032 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6clxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s2ptt_openshift-marketplace(7b0787d1-231e-453f-8f0a-09804298f1db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:37 crc kubenswrapper[4744]: E1205 20:13:37.074259 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s2ptt" podUID="7b0787d1-231e-453f-8f0a-09804298f1db"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.580549 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s2ptt" podUID="7b0787d1-231e-453f-8f0a-09804298f1db"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.705228 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.705469 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-849np,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5x6x6_openshift-marketplace(2db367c1-8f1b-4096-9f23-5a3d14d3980f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.706711 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5x6x6" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.728708 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.728918 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bd8v4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7fvql_openshift-marketplace(b57bf7af-b1cf-4cd9-b431-db0540c6ffc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.730416 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7fvql" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.755630 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.755803 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7txn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9sq9s_openshift-marketplace(f76f1c47-c74d-46cb-ad16-db7392a47a9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:38 crc kubenswrapper[4744]: E1205 20:13:38.757043 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9sq9s" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.695101 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9sq9s" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.695125 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7fvql" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.695372 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5x6x6" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.797126 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.797580 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgv2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hpp25_openshift-marketplace(b06f2a4f-7424-477d-b47a-9d71ce3cdd21): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.803447 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hpp25" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.803512 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.803654 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dvtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tclr2_openshift-marketplace(f07b8700-0120-4aa2-bd07-8a6f06d8dbf8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.805323 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tclr2" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.838353 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.838536 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pmf9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2n2x7_openshift-marketplace(ef4860d5-e9ab-4e8e-8b12-b5f004ea40db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.839962 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2n2x7" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db"
Dec 05 20:13:39 crc kubenswrapper[4744]: I1205 20:13:39.844043 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/1.log"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.847390 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hpp25" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21"
Dec 05 20:13:39 crc kubenswrapper[4744]: E1205 20:13:39.847723 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tclr2" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8"
Dec 05 20:13:39 crc kubenswrapper[4744]: I1205 20:13:39.949400 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cgjbb"]
Dec 05 20:13:39 crc kubenswrapper[4744]: I1205 20:13:39.975384 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
event={"ID":"9d0c84c8-b581-47ce-8cb8-956d3ef79238","Type":"ContainerStarted","Data":"fdc708d4ddfb54b8edc9768fc2bad5c41253fcb0549e3c8e0ca577befe5a1684"} Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.857681 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" event={"ID":"9d0c84c8-b581-47ce-8cb8-956d3ef79238","Type":"ContainerStarted","Data":"dc39a90891c6fc63af6e2b6f2923c16e6d98693469d3d11969c372977f7da0a3"} Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.857709 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cgjbb" event={"ID":"9d0c84c8-b581-47ce-8cb8-956d3ef79238","Type":"ContainerStarted","Data":"5ba73d842140f67381d47fbe17ed18882cfbb292860a12a339ac8b1aec62ddac"} Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.859755 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-h777m_476c0833-0a8f-4824-a7fe-6f28aada483b/cluster-samples-operator/1.log" Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.860591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h777m" event={"ID":"476c0833-0a8f-4824-a7fe-6f28aada483b","Type":"ContainerStarted","Data":"3fef4ddee91e1e666ef1b7e2825cf75005ff6a93982511f15809c3dfbdc850ef"} Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.863904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"965bad0f-5d6e-424b-a991-e08173625864","Type":"ContainerStarted","Data":"305d50a03e56f038aa6226af79ddc28646e1422250c412455b3ebd45e5252bc0"} Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.863939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"965bad0f-5d6e-424b-a991-e08173625864","Type":"ContainerStarted","Data":"a1d71b2b56d57132b36adc8e6f483372eb5f0b88edc7b4d50fc06e44c84262ac"} Dec 05 20:13:40 crc kubenswrapper[4744]: E1205 20:13:40.864631 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2n2x7" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.879085 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cgjbb" podStartSLOduration=170.879060814 podStartE2EDuration="2m50.879060814s" podCreationTimestamp="2025-12-05 20:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:40.872362551 +0000 UTC m=+191.102173979" watchObservedRunningTime="2025-12-05 20:13:40.879060814 +0000 UTC m=+191.108872182" Dec 05 20:13:40 crc kubenswrapper[4744]: I1205 20:13:40.898764 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.898739869 podStartE2EDuration="4.898739869s" podCreationTimestamp="2025-12-05 20:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:40.893836814 +0000 UTC m=+191.123648192" 
watchObservedRunningTime="2025-12-05 20:13:40.898739869 +0000 UTC m=+191.128551247" Dec 05 20:13:41 crc kubenswrapper[4744]: I1205 20:13:41.868316 4744 generic.go:334] "Generic (PLEG): container finished" podID="965bad0f-5d6e-424b-a991-e08173625864" containerID="305d50a03e56f038aa6226af79ddc28646e1422250c412455b3ebd45e5252bc0" exitCode=0 Dec 05 20:13:41 crc kubenswrapper[4744]: I1205 20:13:41.868438 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"965bad0f-5d6e-424b-a991-e08173625864","Type":"ContainerDied","Data":"305d50a03e56f038aa6226af79ddc28646e1422250c412455b3ebd45e5252bc0"} Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.003022 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.003713 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.012210 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.097272 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.097339 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.097372 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.198602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.198651 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.198711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.198773 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.198808 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.230584 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.335616 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.536511 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 20:13:42 crc kubenswrapper[4744]: W1205 20:13:42.549451 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab4cb2ea_f897_4ccf_888a_5c1eca4e4c41.slice/crio-9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac WatchSource:0}: Error finding container 9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac: Status 404 returned error can't find the container with id 9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac Dec 05 20:13:42 crc kubenswrapper[4744]: I1205 20:13:42.879434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41","Type":"ContainerStarted","Data":"9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac"} Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.088538 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.214641 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir\") pod \"965bad0f-5d6e-424b-a991-e08173625864\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.214772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access\") pod \"965bad0f-5d6e-424b-a991-e08173625864\" (UID: \"965bad0f-5d6e-424b-a991-e08173625864\") " Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.214924 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "965bad0f-5d6e-424b-a991-e08173625864" (UID: "965bad0f-5d6e-424b-a991-e08173625864"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.215188 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/965bad0f-5d6e-424b-a991-e08173625864-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.222587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "965bad0f-5d6e-424b-a991-e08173625864" (UID: "965bad0f-5d6e-424b-a991-e08173625864"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.316092 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965bad0f-5d6e-424b-a991-e08173625864-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.887855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"965bad0f-5d6e-424b-a991-e08173625864","Type":"ContainerDied","Data":"a1d71b2b56d57132b36adc8e6f483372eb5f0b88edc7b4d50fc06e44c84262ac"} Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.887919 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d71b2b56d57132b36adc8e6f483372eb5f0b88edc7b4d50fc06e44c84262ac" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.887870 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.889422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41","Type":"ContainerStarted","Data":"669fc7d991370efcd0db4a16a35177c646d47f053f84b4bc658d24788e2ffac7"} Dec 05 20:13:43 crc kubenswrapper[4744]: I1205 20:13:43.914608 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.914585842 podStartE2EDuration="2.914585842s" podCreationTimestamp="2025-12-05 20:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:13:43.91138861 +0000 UTC m=+194.141200008" watchObservedRunningTime="2025-12-05 20:13:43.914585842 +0000 UTC m=+194.144397240" Dec 05 20:13:49 crc kubenswrapper[4744]: I1205 20:13:49.806946 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:13:49 crc kubenswrapper[4744]: I1205 20:13:49.807599 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:13:49 crc kubenswrapper[4744]: I1205 20:13:49.922148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerStarted","Data":"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2"} Dec 05 20:13:50 crc kubenswrapper[4744]: I1205 20:13:50.942016 4744 generic.go:334] "Generic (PLEG): container finished" podID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerID="76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2" exitCode=0 Dec 05 20:13:50 crc kubenswrapper[4744]: I1205 20:13:50.942059 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerDied","Data":"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2"} Dec 05 20:13:51 crc kubenswrapper[4744]: I1205 20:13:51.948314 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerStarted","Data":"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519"} Dec 05 20:13:51 crc kubenswrapper[4744]: I1205 20:13:51.951966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerStarted","Data":"449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949"} Dec 05 20:13:51 crc kubenswrapper[4744]: I1205 20:13:51.953489 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerStarted","Data":"fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447"} Dec 05 20:13:51 crc kubenswrapper[4744]: I1205 20:13:51.995999 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dd7w8" podStartSLOduration=2.817384298 podStartE2EDuration="54.995976779s" podCreationTimestamp="2025-12-05 20:12:57 +0000 UTC" firstStartedPulling="2025-12-05 20:12:59.368646431 +0000 UTC m=+149.598457799" lastFinishedPulling="2025-12-05 20:13:51.547238912 +0000 UTC m=+201.777050280" observedRunningTime="2025-12-05 20:13:51.979674639 +0000 UTC m=+202.209486017" watchObservedRunningTime="2025-12-05 20:13:51.995976779 +0000 UTC m=+202.225788147" Dec 05 20:13:52 crc kubenswrapper[4744]: E1205 20:13:52.047726 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0787d1_231e_453f_8f0a_09804298f1db.slice/crio-fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06f2a4f_7424_477d_b47a_9d71ce3cdd21.slice/crio-conmon-449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06f2a4f_7424_477d_b47a_9d71ce3cdd21.slice/crio-449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0787d1_231e_453f_8f0a_09804298f1db.slice/crio-conmon-fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:13:52 crc 
kubenswrapper[4744]: I1205 20:13:52.961755 4744 generic.go:334] "Generic (PLEG): container finished" podID="7b0787d1-231e-453f-8f0a-09804298f1db" containerID="fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447" exitCode=0 Dec 05 20:13:52 crc kubenswrapper[4744]: I1205 20:13:52.961853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerDied","Data":"fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447"} Dec 05 20:13:52 crc kubenswrapper[4744]: I1205 20:13:52.963778 4744 generic.go:334] "Generic (PLEG): container finished" podID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerID="410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7" exitCode=0 Dec 05 20:13:52 crc kubenswrapper[4744]: I1205 20:13:52.963845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerDied","Data":"410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7"} Dec 05 20:13:52 crc kubenswrapper[4744]: I1205 20:13:52.967304 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerID="449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949" exitCode=0 Dec 05 20:13:52 crc kubenswrapper[4744]: I1205 20:13:52.967337 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerDied","Data":"449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.980078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerStarted","Data":"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.982073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerStarted","Data":"a057ecea973103f272a6c84cb37d7e5fa94849aeeaadac2427658ae78bcd36df"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.984437 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerStarted","Data":"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.986275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerStarted","Data":"ff12dcfb0cc6a3ebb8ac55c3ae16504d819843a05373a7efe62912cf8e75fb03"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.987740 4744 generic.go:334] "Generic (PLEG): container finished" podID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerID="c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c" exitCode=0 Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.987797 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" 
event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerDied","Data":"c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c"} Dec 05 20:13:54 crc kubenswrapper[4744]: I1205 20:13:54.989233 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerStarted","Data":"d7076a96891777dfe1b1f3f0dda9bfd096aac6098387aa30bccdef1f1ef3ba44"} Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.050457 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2ptt" podStartSLOduration=4.826347279 podStartE2EDuration="1m1.050436303s" podCreationTimestamp="2025-12-05 20:12:54 +0000 UTC" firstStartedPulling="2025-12-05 20:12:57.148746751 +0000 UTC m=+147.378558119" lastFinishedPulling="2025-12-05 20:13:53.372835775 +0000 UTC m=+203.602647143" observedRunningTime="2025-12-05 20:13:55.047262502 +0000 UTC m=+205.277073870" watchObservedRunningTime="2025-12-05 20:13:55.050436303 +0000 UTC m=+205.280247671" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.053231 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.053277 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.064725 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9sq9s" podStartSLOduration=4.80419947 podStartE2EDuration="1m1.064705921s" podCreationTimestamp="2025-12-05 20:12:54 +0000 UTC" firstStartedPulling="2025-12-05 20:12:57.131810775 +0000 UTC m=+147.361622133" lastFinishedPulling="2025-12-05 20:13:53.392317216 +0000 UTC m=+203.622128584" observedRunningTime="2025-12-05 20:13:55.063927821 +0000 UTC m=+205.293739189" watchObservedRunningTime="2025-12-05 20:13:55.064705921 +0000 UTC m=+205.294517289" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.078362 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hpp25" podStartSLOduration=4.077047876 podStartE2EDuration="1m1.078348951s" podCreationTimestamp="2025-12-05 20:12:54 +0000 UTC" firstStartedPulling="2025-12-05 20:12:57.145396195 +0000 UTC m=+147.375207563" lastFinishedPulling="2025-12-05 20:13:54.14669727 +0000 UTC m=+204.376508638" observedRunningTime="2025-12-05 20:13:55.075498688 +0000 UTC m=+205.305310056" watchObservedRunningTime="2025-12-05 20:13:55.078348951 +0000 UTC m=+205.308160319" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.431116 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.431163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.996369 4744 generic.go:334] "Generic (PLEG): container finished" podID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerID="d7076a96891777dfe1b1f3f0dda9bfd096aac6098387aa30bccdef1f1ef3ba44" exitCode=0 Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.996446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" 
event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerDied","Data":"d7076a96891777dfe1b1f3f0dda9bfd096aac6098387aa30bccdef1f1ef3ba44"} Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.999322 4744 generic.go:334] "Generic (PLEG): container finished" podID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerID="d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11" exitCode=0 Dec 05 20:13:55 crc kubenswrapper[4744]: I1205 20:13:55.999404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerDied","Data":"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11"} Dec 05 20:13:56 crc kubenswrapper[4744]: I1205 20:13:56.107277 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hpp25" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="registry-server" probeResult="failure" output=< Dec 05 20:13:56 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Dec 05 20:13:56 crc kubenswrapper[4744]: > Dec 05 20:13:56 crc kubenswrapper[4744]: I1205 20:13:56.462522 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s2ptt" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="registry-server" probeResult="failure" output=< Dec 05 20:13:56 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Dec 05 20:13:56 crc kubenswrapper[4744]: > Dec 05 20:13:57 crc kubenswrapper[4744]: I1205 20:13:57.006319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerStarted","Data":"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427"} Dec 05 20:13:58 crc kubenswrapper[4744]: I1205 20:13:58.091903 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:13:58 crc kubenswrapper[4744]: I1205 20:13:58.091939 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:13:58 crc kubenswrapper[4744]: I1205 20:13:58.280499 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:13:59 crc kubenswrapper[4744]: I1205 20:13:59.035588 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2n2x7" podStartSLOduration=5.710895666 podStartE2EDuration="1m3.035573685s" podCreationTimestamp="2025-12-05 20:12:56 +0000 UTC" firstStartedPulling="2025-12-05 20:12:58.342560422 +0000 UTC m=+148.572371790" lastFinishedPulling="2025-12-05 20:13:55.667238441 +0000 UTC m=+205.897049809" observedRunningTime="2025-12-05 20:13:59.032610589 +0000 UTC m=+209.262421957" watchObservedRunningTime="2025-12-05 20:13:59.035573685 +0000 UTC m=+209.265385053" Dec 05 20:13:59 crc kubenswrapper[4744]: I1205 20:13:59.050873 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dd7w8" Dec 05 20:14:04 crc kubenswrapper[4744]: I1205 20:14:04.862518 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9sq9s" Dec 05 20:14:04 crc kubenswrapper[4744]: I1205 20:14:04.863110 4744 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9sq9s" Dec 05 20:14:04 crc kubenswrapper[4744]: I1205 20:14:04.915335 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9sq9s" Dec 05 20:14:05 crc kubenswrapper[4744]: I1205 20:14:05.102718 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:14:05 crc kubenswrapper[4744]: I1205 20:14:05.125216 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9sq9s" Dec 05 20:14:05 crc kubenswrapper[4744]: I1205 20:14:05.155373 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:14:05 crc kubenswrapper[4744]: I1205 20:14:05.501914 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:14:05 crc kubenswrapper[4744]: I1205 20:14:05.566570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:14:06 crc kubenswrapper[4744]: I1205 20:14:06.961333 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"] Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.070129 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2ptt" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="registry-server" containerID="cri-o://a057ecea973103f272a6c84cb37d7e5fa94849aeeaadac2427658ae78bcd36df" gracePeriod=2 Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.225107 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.225715 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.293728 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.552336 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpp25"] Dec 05 20:14:07 crc kubenswrapper[4744]: I1205 20:14:07.552796 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hpp25" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="registry-server" containerID="cri-o://ff12dcfb0cc6a3ebb8ac55c3ae16504d819843a05373a7efe62912cf8e75fb03" gracePeriod=2 Dec 05 20:14:08 crc kubenswrapper[4744]: I1205 20:14:08.082501 4744 generic.go:334] "Generic (PLEG): container finished" podID="7b0787d1-231e-453f-8f0a-09804298f1db" containerID="a057ecea973103f272a6c84cb37d7e5fa94849aeeaadac2427658ae78bcd36df" exitCode=0 Dec 05 20:14:08 crc kubenswrapper[4744]: I1205 20:14:08.093879 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerDied","Data":"a057ecea973103f272a6c84cb37d7e5fa94849aeeaadac2427658ae78bcd36df"} Dec 05 20:14:08 crc kubenswrapper[4744]: I1205 20:14:08.150660 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.092790 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerID="ff12dcfb0cc6a3ebb8ac55c3ae16504d819843a05373a7efe62912cf8e75fb03" exitCode=0 Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.092845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerDied","Data":"ff12dcfb0cc6a3ebb8ac55c3ae16504d819843a05373a7efe62912cf8e75fb03"} Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.274426 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.386393 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content\") pod \"7b0787d1-231e-453f-8f0a-09804298f1db\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.386514 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities\") pod \"7b0787d1-231e-453f-8f0a-09804298f1db\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.386724 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clxq\" (UniqueName: \"kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq\") pod \"7b0787d1-231e-453f-8f0a-09804298f1db\" (UID: \"7b0787d1-231e-453f-8f0a-09804298f1db\") " Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.387956 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities" (OuterVolumeSpecName: "utilities") pod "7b0787d1-231e-453f-8f0a-09804298f1db" (UID: "7b0787d1-231e-453f-8f0a-09804298f1db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.395079 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq" (OuterVolumeSpecName: "kube-api-access-6clxq") pod "7b0787d1-231e-453f-8f0a-09804298f1db" (UID: "7b0787d1-231e-453f-8f0a-09804298f1db"). InnerVolumeSpecName "kube-api-access-6clxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.457757 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b0787d1-231e-453f-8f0a-09804298f1db" (UID: "7b0787d1-231e-453f-8f0a-09804298f1db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.488800 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.488850 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0787d1-231e-453f-8f0a-09804298f1db-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.488864 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clxq\" (UniqueName: \"kubernetes.io/projected/7b0787d1-231e-453f-8f0a-09804298f1db-kube-api-access-6clxq\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:09 crc kubenswrapper[4744]: I1205 20:14:09.951260 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"] Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.102225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2ptt" event={"ID":"7b0787d1-231e-453f-8f0a-09804298f1db","Type":"ContainerDied","Data":"57bf9e20bc3bcf918fc084984e9e5a570e87431ae50fac516dc81976976c6153"} Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.102334 4744 scope.go:117] "RemoveContainer" containerID="a057ecea973103f272a6c84cb37d7e5fa94849aeeaadac2427658ae78bcd36df" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.102387 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2ptt" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.133806 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"] Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.140281 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2ptt"] Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.143134 4744 scope.go:117] "RemoveContainer" containerID="fe69676b01229c33b8f136d2ea6251be8bb07a3eec9696af267bdf4c8b5c9447" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.171349 4744 scope.go:117] "RemoveContainer" containerID="8329f6aba3faec10162b6e9ada8e24903a80b16acbe7950c770240e8fbfac565" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.436249 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.539584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content\") pod \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.539665 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgv2n\" (UniqueName: \"kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n\") pod \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.539733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities\") pod \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\" (UID: \"b06f2a4f-7424-477d-b47a-9d71ce3cdd21\") " Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.541699 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities" (OuterVolumeSpecName: "utilities") pod "b06f2a4f-7424-477d-b47a-9d71ce3cdd21" (UID: "b06f2a4f-7424-477d-b47a-9d71ce3cdd21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.544716 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n" (OuterVolumeSpecName: "kube-api-access-mgv2n") pod "b06f2a4f-7424-477d-b47a-9d71ce3cdd21" (UID: "b06f2a4f-7424-477d-b47a-9d71ce3cdd21"). InnerVolumeSpecName "kube-api-access-mgv2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.594638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b06f2a4f-7424-477d-b47a-9d71ce3cdd21" (UID: "b06f2a4f-7424-477d-b47a-9d71ce3cdd21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.641742 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.641791 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:10 crc kubenswrapper[4744]: I1205 20:14:10.641812 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgv2n\" (UniqueName: \"kubernetes.io/projected/b06f2a4f-7424-477d-b47a-9d71ce3cdd21-kube-api-access-mgv2n\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.114953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpp25" event={"ID":"b06f2a4f-7424-477d-b47a-9d71ce3cdd21","Type":"ContainerDied","Data":"9bbb64196873ddc3251234dbe1fa096bd318bf80902ec6d9bd5e5b426d0c8ef2"} Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.114983 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpp25" Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.115034 4744 scope.go:117] "RemoveContainer" containerID="ff12dcfb0cc6a3ebb8ac55c3ae16504d819843a05373a7efe62912cf8e75fb03" Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.115172 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2n2x7" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="registry-server" containerID="cri-o://08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427" gracePeriod=2 Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.145754 4744 scope.go:117] "RemoveContainer" containerID="449156bf1ef8149b8ede713a2839c05e4d49d34e2bcf31d29d3ed96d3e5b8949" Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.164412 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpp25"] Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.185561 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hpp25"] Dec 05 20:14:11 crc kubenswrapper[4744]: I1205 20:14:11.188071 4744 scope.go:117] "RemoveContainer" containerID="6fba76e79ae3216a2b9fcd333c0b4659e7101bfe672c8f79e6e14149bb0d7118" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.092544 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" path="/var/lib/kubelet/pods/7b0787d1-231e-453f-8f0a-09804298f1db/volumes" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.093811 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" path="/var/lib/kubelet/pods/b06f2a4f-7424-477d-b47a-9d71ce3cdd21/volumes" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.636686 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.773137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities\") pod \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.773381 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content\") pod \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.773559 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmf9c\" (UniqueName: \"kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c\") pod \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\" (UID: \"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db\") " Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.775234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities" (OuterVolumeSpecName: "utilities") pod "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" (UID: "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.786157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c" (OuterVolumeSpecName: "kube-api-access-pmf9c") pod "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" (UID: "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db"). InnerVolumeSpecName "kube-api-access-pmf9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.790144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" (UID: "ef4860d5-e9ab-4e8e-8b12-b5f004ea40db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.875047 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmf9c\" (UniqueName: \"kubernetes.io/projected/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-kube-api-access-pmf9c\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.875078 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:12 crc kubenswrapper[4744]: I1205 20:14:12.875087 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.130234 4744 generic.go:334] "Generic (PLEG): container finished" podID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerID="1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15" exitCode=0 Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.130328 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerDied","Data":"1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15"} Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.134412 4744 generic.go:334] "Generic (PLEG): container finished" podID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerID="08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427" exitCode=0 Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.134497 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2n2x7" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.135303 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerDied","Data":"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427"} Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.135349 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2n2x7" event={"ID":"ef4860d5-e9ab-4e8e-8b12-b5f004ea40db","Type":"ContainerDied","Data":"e024676422bf2c0764ac1923c6ce7ab9d8dd57030dd290474a75fa5789228a49"} Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.135371 4744 scope.go:117] "RemoveContainer" containerID="08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.143550 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerStarted","Data":"9c72da9e9f9915bf9bdc0d19779b490a23f097f6ab26740670049faa9f00ed24"} Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.148501 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerStarted","Data":"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1"} Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.169332 4744 scope.go:117] "RemoveContainer" containerID="c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.196833 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5x6x6" podStartSLOduration=5.130623202 podStartE2EDuration="1m19.196805683s" podCreationTimestamp="2025-12-05 20:12:54 +0000 UTC" firstStartedPulling="2025-12-05 20:12:56.077681136 +0000 UTC m=+146.307492504" lastFinishedPulling="2025-12-05 20:14:10.143863607 +0000 UTC m=+220.373674985" observedRunningTime="2025-12-05 20:14:13.180391051 +0000 UTC m=+223.410202429" watchObservedRunningTime="2025-12-05 20:14:13.196805683 +0000 UTC m=+223.426617061" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.198613 4744 scope.go:117] "RemoveContainer" containerID="37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.204054 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"] Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.208452 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2n2x7"] Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.214016 4744 scope.go:117] "RemoveContainer" containerID="08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.214918 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fvql" podStartSLOduration=7.625985637 podStartE2EDuration="1m16.214904508s" podCreationTimestamp="2025-12-05 20:12:57 +0000 UTC" firstStartedPulling="2025-12-05 20:12:59.388567453 +0000 UTC m=+149.618378821" lastFinishedPulling="2025-12-05 20:14:07.977486284 +0000 UTC m=+218.207297692" 
observedRunningTime="2025-12-05 20:14:13.211020848 +0000 UTC m=+223.440832236" watchObservedRunningTime="2025-12-05 20:14:13.214904508 +0000 UTC m=+223.444715896" Dec 05 20:14:13 crc kubenswrapper[4744]: E1205 20:14:13.217424 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427\": container with ID starting with 08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427 not found: ID does not exist" containerID="08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.217481 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427"} err="failed to get container status \"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427\": rpc error: code = NotFound desc = could not find container \"08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427\": container with ID starting with 08c513cd6bd4a944ca7ea50a73eaddb1042316c699ee8a1ccc675f95af166427 not found: ID does not exist" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.217539 4744 scope.go:117] "RemoveContainer" containerID="c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c" Dec 05 20:14:13 crc kubenswrapper[4744]: E1205 20:14:13.217792 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c\": container with ID starting with c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c not found: ID does not exist" containerID="c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.217820 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c"} err="failed to get container status \"c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c\": rpc error: code = NotFound desc = could not find container \"c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c\": container with ID starting with c1a1e586a78a188a11593ae19667f75cf52b866ff02b9efc52c48240c19bea5c not found: ID does not exist" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.217837 4744 scope.go:117] "RemoveContainer" containerID="37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e" Dec 05 20:14:13 crc kubenswrapper[4744]: E1205 20:14:13.218046 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e\": container with ID starting with 37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e not found: ID does not exist" containerID="37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e" Dec 05 20:14:13 crc kubenswrapper[4744]: I1205 20:14:13.218074 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e"} err="failed to get container status \"37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e\": rpc error: code = NotFound desc = could not find container 
\"37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e\": container with ID starting with 37f3b648fa2a1567c53eba4da3d563526bd269ed87933facc699f9cbcd647b2e not found: ID does not exist" Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.108501 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" path="/var/lib/kubelet/pods/ef4860d5-e9ab-4e8e-8b12-b5f004ea40db/volumes" Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.156336 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerStarted","Data":"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d"} Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.175263 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tclr2" podStartSLOduration=2.859904101 podStartE2EDuration="1m18.175244267s" podCreationTimestamp="2025-12-05 20:12:56 +0000 UTC" firstStartedPulling="2025-12-05 20:12:58.280205608 +0000 UTC m=+148.510016976" lastFinishedPulling="2025-12-05 20:14:13.595545774 +0000 UTC m=+223.825357142" observedRunningTime="2025-12-05 20:14:14.172763233 +0000 UTC m=+224.402574611" watchObservedRunningTime="2025-12-05 20:14:14.175244267 +0000 UTC m=+224.405055635" Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.628451 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5x6x6" Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.628502 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5x6x6" Dec 05 20:14:14 crc kubenswrapper[4744]: I1205 20:14:14.667827 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5x6x6" Dec 05 20:14:16 crc kubenswrapper[4744]: I1205 20:14:16.876272 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:14:16 crc kubenswrapper[4744]: I1205 20:14:16.876626 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:14:16 crc kubenswrapper[4744]: I1205 20:14:16.937681 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:14:17 crc kubenswrapper[4744]: I1205 20:14:17.297502 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dn5pv"] Dec 05 20:14:18 crc kubenswrapper[4744]: I1205 20:14:18.063744 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:14:18 crc kubenswrapper[4744]: I1205 20:14:18.063853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:14:19 crc kubenswrapper[4744]: I1205 20:14:19.116929 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7fvql" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="registry-server" probeResult="failure" output=< Dec 05 20:14:19 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Dec 05 20:14:19 crc kubenswrapper[4744]: > Dec 05 20:14:19 crc kubenswrapper[4744]: 
I1205 20:14:19.806579 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:14:19 crc kubenswrapper[4744]: I1205 20:14:19.806954 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:14:19 crc kubenswrapper[4744]: I1205 20:14:19.807007 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:14:19 crc kubenswrapper[4744]: I1205 20:14:19.807665 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:14:19 crc kubenswrapper[4744]: I1205 20:14:19.807738 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b" gracePeriod=600 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.470827 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.471510 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.471616 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.471707 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.471793 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.471882 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="extract-utilities" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.471966 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="extract-utilities" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472050 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="extract-utilities" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472134 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="extract-utilities" Dec 05 20:14:20 crc 
kubenswrapper[4744]: E1205 20:14:20.472214 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472285 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472395 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472474 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472555 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="extract-utilities" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472629 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="extract-utilities" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472710 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965bad0f-5d6e-424b-a991-e08173625864" containerName="pruner" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472723 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="965bad0f-5d6e-424b-a991-e08173625864" containerName="pruner" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472735 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472743 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="extract-content" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.472753 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472760 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472899 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="965bad0f-5d6e-424b-a991-e08173625864" containerName="pruner" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472912 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4860d5-e9ab-4e8e-8b12-b5f004ea40db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472923 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06f2a4f-7424-477d-b47a-9d71ce3cdd21" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.472931 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0787d1-231e-453f-8f0a-09804298f1db" containerName="registry-server" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473282 4744 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473322 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473450 
4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473461 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473473 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473480 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473490 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473497 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473509 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473516 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473529 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473537 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473551 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473558 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.473568 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473576 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473522 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473705 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473719 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473734 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473745 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473754 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473765 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473913 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8" gracePeriod=15 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473983 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5" gracePeriod=15 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.473988 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb" gracePeriod=15 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.474050 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8" gracePeriod=15 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.474194 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989" gracePeriod=15 Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.482958 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 05 
Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.527347 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671628 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671675 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671802 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671850 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.671866 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.772816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.772962 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.773227 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.773516 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.773251 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.773559 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.773899 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774014 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774084 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774234 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774675 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774604 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774266 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.774794 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: I1205 20:14:20.828090 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:20 crc kubenswrapper[4744]: W1205 20:14:20.844986 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-76f7ab1216becccc24644c7486ff722d14166597e40d7ff94e68577aa50519c5 WatchSource:0}: Error finding container 76f7ab1216becccc24644c7486ff722d14166597e40d7ff94e68577aa50519c5: Status 404 returned error can't find the container with id 76f7ab1216becccc24644c7486ff722d14166597e40d7ff94e68577aa50519c5 Dec 05 20:14:20 crc kubenswrapper[4744]: E1205 20:14:20.848356 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6aead7968f0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:14:20.847673103 +0000 UTC m=+231.077484481,LastTimestamp:2025-12-05 20:14:20.847673103 +0000 UTC m=+231.077484481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.198125 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.199841 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.200529 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5" exitCode=0 Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.200574 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb" exitCode=0 Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.200591 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8" exitCode=0 Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.200604 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989" exitCode=2 Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.200612 4744 scope.go:117] "RemoveContainer" containerID="32bacb176eef6aa8f7fe2f0a735be3ec60447211d27dc7f8178bdf657b198ea1" Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 
20:14:21.202023 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b" exitCode=0 Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.202093 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b"} Dec 05 20:14:21 crc kubenswrapper[4744]: I1205 20:14:21.203057 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"76f7ab1216becccc24644c7486ff722d14166597e40d7ff94e68577aa50519c5"} Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.211541 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca"} Dec 05 20:14:22 crc kubenswrapper[4744]: E1205 20:14:22.212423 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.214933 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.221935 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f"} Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.222615 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.224184 4744 generic.go:334] "Generic (PLEG): container finished" podID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" containerID="669fc7d991370efcd0db4a16a35177c646d47f053f84b4bc658d24788e2ffac7" exitCode=0 Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.224219 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41","Type":"ContainerDied","Data":"669fc7d991370efcd0db4a16a35177c646d47f053f84b4bc658d24788e2ffac7"} Dec 05 20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.224811 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 
20:14:22 crc kubenswrapper[4744]: I1205 20:14:22.225494 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.121674 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.122687 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.123433 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.123657 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.123886 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.231735 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.233553 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8" exitCode=0 Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.233708 4744 scope.go:117] "RemoveContainer" containerID="83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.233719 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.234381 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.253397 4744 scope.go:117] "RemoveContainer" containerID="9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.266948 4744 scope.go:117] "RemoveContainer" containerID="cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.285870 4744 scope.go:117] "RemoveContainer" containerID="62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.299151 4744 scope.go:117] "RemoveContainer" containerID="488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.305680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.305841 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.305956 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.306148 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.306008 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.306186 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.307004 4744 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.307039 4744 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.307059 4744 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.323942 4744 scope.go:117] "RemoveContainer" containerID="dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.347456 4744 scope.go:117] "RemoveContainer" containerID="83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.349944 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\": container with ID starting with 83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5 not found: ID does not exist" containerID="83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.349993 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5"} err="failed to get container status \"83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\": rpc error: code = NotFound desc = could not find container \"83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5\": container with ID starting with 83d97edce63fa0bed758c95ffef33dd22623857e4b84f0e795c9fc142c9dcad5 not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.350055 4744 scope.go:117] "RemoveContainer" containerID="9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.351632 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\": container with ID starting with 9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb not found: ID does not exist" containerID="9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.351700 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb"} err="failed to get container status \"9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\": rpc error: code = NotFound desc = could not find container \"9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb\": container with ID starting with 9c3f29158076e515e8251a0205a5af00afeb778a898a793138d923f8a4f2ceeb not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.351746 4744 
scope.go:117] "RemoveContainer" containerID="cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.352416 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\": container with ID starting with cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8 not found: ID does not exist" containerID="cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.352461 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8"} err="failed to get container status \"cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\": rpc error: code = NotFound desc = could not find container \"cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8\": container with ID starting with cbb5dca01fc72145beaaec80d521b60c04ff0dd34986e3cce6db5696613095e8 not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.352493 4744 scope.go:117] "RemoveContainer" containerID="62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.352865 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\": container with ID starting with 62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989 not found: ID does not exist" containerID="62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.352909 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989"} err="failed to get container status \"62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\": rpc error: code = NotFound desc = could not find container \"62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989\": container with ID starting with 62b00f3bfe46243102d0c3c531803a33085048c7ebbcfac70c2ef261db1d6989 not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.352938 4744 scope.go:117] "RemoveContainer" containerID="488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.353270 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\": container with ID starting with 488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8 not found: ID does not exist" containerID="488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.353325 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8"} err="failed to get container status \"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\": rpc error: code = NotFound desc = could not find container \"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\": container with ID starting with 
Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.353325 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8"} err="failed to get container status \"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\": rpc error: code = NotFound desc = could not find container \"488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8\": container with ID starting with 488a1fa0533dca326d39af735cb43f0a9d85b1bd44228ef0aec954f563537bf8 not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.353350 4744 scope.go:117] "RemoveContainer" containerID="dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f" Dec 05 20:14:23 crc kubenswrapper[4744]: E1205 20:14:23.353644 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\": container with ID starting with dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f not found: ID does not exist" containerID="dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.353684 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f"} err="failed to get container status \"dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\": rpc error: code = NotFound desc = could not find container \"dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f\": container with ID starting with dcc4ac30d4394cdc56fd3fab6e87ff7dc12dda245c54ccae11450e17b02f542f not found: ID does not exist" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.457713 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.458145 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.458744 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.459016 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.546630 4744 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.546953 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc 
kubenswrapper[4744]: I1205 20:14:23.547589 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.609954 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir\") pod \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610236 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access\") pod \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock\") pod \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\" (UID: \"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41\") " Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610443 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" (UID: "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610590 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock" (OuterVolumeSpecName: "var-lock") pod "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" (UID: "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610794 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.610888 4744 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.617743 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" (UID: "ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:23 crc kubenswrapper[4744]: I1205 20:14:23.712020 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.092481 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.242021 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41","Type":"ContainerDied","Data":"9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac"} Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.243283 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e96bf17703cfccafdc1a091a8c2f74b45fb04e386de16dc811595008f1c95ac" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.242067 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.246428 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.246919 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.674078 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5x6x6" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.675149 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.676009 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:24 crc kubenswrapper[4744]: I1205 20:14:24.676465 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:25 crc kubenswrapper[4744]: E1205 
20:14:25.339258 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e6aead7968f0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 20:14:20.847673103 +0000 UTC m=+231.077484481,LastTimestamp:2025-12-05 20:14:20.847673103 +0000 UTC m=+231.077484481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 20:14:26 crc kubenswrapper[4744]: I1205 20:14:26.932196 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tclr2" Dec 05 20:14:26 crc kubenswrapper[4744]: I1205 20:14:26.933129 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:26 crc kubenswrapper[4744]: I1205 20:14:26.933760 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:26 crc kubenswrapper[4744]: I1205 20:14:26.934215 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:26 crc kubenswrapper[4744]: I1205 20:14:26.934776 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.136751 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.138405 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: 
connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.138897 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.139671 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.140251 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.140792 4744 status_manager.go:851] "Failed to get status for pod" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" pod="openshift-marketplace/redhat-operators-7fvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7fvql\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.203808 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.204715 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.205332 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.205715 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.206201 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:28 crc kubenswrapper[4744]: I1205 20:14:28.206743 4744 status_manager.go:851] "Failed to get status for pod" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" 
pod="openshift-marketplace/redhat-operators-7fvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7fvql\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.088468 4744 status_manager.go:851] "Failed to get status for pod" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" pod="openshift-marketplace/redhat-operators-7fvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7fvql\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.088890 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.089189 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.089808 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.090501 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.320882 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.321619 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.322189 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.322743 4744 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.323144 4744 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:30 crc kubenswrapper[4744]: I1205 20:14:30.323201 4744 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.323683 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.524939 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Dec 05 20:14:30 crc kubenswrapper[4744]: E1205 20:14:30.926488 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.080554 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.082232 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.082842 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.083407 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.083792 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.084201 4744 status_manager.go:851] "Failed to get status for pod" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" pod="openshift-marketplace/redhat-operators-7fvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7fvql\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 
Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.102692 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.102739 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:31 crc kubenswrapper[4744]: E1205 20:14:31.103205 4744 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.103835 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:31 crc kubenswrapper[4744]: I1205 20:14:31.293926 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e1acee7dca68c9f19b50636fca91e26c098e28872b58f03d12af274035628114"} Dec 05 20:14:31 crc kubenswrapper[4744]: E1205 20:14:31.727188 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.300900 4744 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2266a951d08d26100a355a758b0825dabf47668fe5af51b947428b961012f77e" exitCode=0 Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.300952 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2266a951d08d26100a355a758b0825dabf47668fe5af51b947428b961012f77e"} Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.301353 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.301391 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:32 crc kubenswrapper[4744]: E1205 20:14:32.301895 4744 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.301939 4744 status_manager.go:851] "Failed to get status for pod" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" pod="openshift-marketplace/redhat-operators-7fvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7fvql\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.302551 4744 status_manager.go:851] "Failed to get status for pod" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bkhvd\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.303029 4744 status_manager.go:851] "Failed to get status for pod" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" pod="openshift-marketplace/redhat-marketplace-tclr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tclr2\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.303471 4744 status_manager.go:851] "Failed to get status for pod" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:32 crc kubenswrapper[4744]: I1205 20:14:32.303797 4744 status_manager.go:851] "Failed to get status for pod" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" pod="openshift-marketplace/community-operators-5x6x6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5x6x6\": dial tcp 38.102.83.51:6443: connect: connection refused" Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.313055 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8d34000961aede70371100d1fe9f252919f3a73d1bb5a4f5bb3f2d22900ec0f7"} Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.313116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a2cada680213265445934a759391f42e54af05633cbaca8c81becddc27ec0c34"} Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.313129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a67bacf422980c7c45c2a1d3a261ca6ddd095a8217751523fbb3c9debe80aa03"} Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.313142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2489fbf81d08d1c1b53a61b2c1af6c8e7881ac3368ee4f4d294497bdf8f04d0d"} Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.316688 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.316732 4744 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5" exitCode=1 Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.316758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5"} Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.317213 4744 scope.go:117] "RemoveContainer" 
containerID="f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5" Dec 05 20:14:33 crc kubenswrapper[4744]: I1205 20:14:33.349025 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.325170 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.325526 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57bd4e3d730c702814d51c7b67aa61831bd4b7f114aba374d92516c6a93e1dc3"} Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.328816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"58fecdc83d5fd9000762c5d425329a1d6dd297735ec3ef13976daccb4ba8a1ec"} Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.329038 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.329061 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:34 crc kubenswrapper[4744]: I1205 20:14:34.329267 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:36 crc kubenswrapper[4744]: I1205 20:14:36.104942 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:36 crc kubenswrapper[4744]: I1205 20:14:36.105389 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:36 crc kubenswrapper[4744]: I1205 20:14:36.116742 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:39 crc kubenswrapper[4744]: I1205 20:14:39.344385 4744 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.118078 4744 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod7b0787d1-231e-453f-8f0a-09804298f1db"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod7b0787d1-231e-453f-8f0a-09804298f1db] : Timed out while waiting for systemd to remove kubepods-burstable-pod7b0787d1_231e_453f_8f0a_09804298f1db.slice" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.127063 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a2e87ec0-a56c-4c12-981b-df64a3dbec1c" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.374482 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.374517 4744 mirror_client.go:130] "Deleting 
a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.378114 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a2e87ec0-a56c-4c12-981b-df64a3dbec1c" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.379795 4744 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://2489fbf81d08d1c1b53a61b2c1af6c8e7881ac3368ee4f4d294497bdf8f04d0d" Dec 05 20:14:40 crc kubenswrapper[4744]: I1205 20:14:40.379828 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 20:14:41 crc kubenswrapper[4744]: I1205 20:14:41.378858 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:41 crc kubenswrapper[4744]: I1205 20:14:41.378885 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1" Dec 05 20:14:41 crc kubenswrapper[4744]: I1205 20:14:41.382252 4744 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a2e87ec0-a56c-4c12-981b-df64a3dbec1c" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.322498 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" podUID="f9c687ae-84e1-44ed-801d-abbbff13acd9" containerName="oauth-openshift" containerID="cri-o://54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40" gracePeriod=15 Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.767694 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.869918 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.869982 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870023 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2btxm\" (UniqueName: \"kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870084 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870116 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870218 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870285 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870372 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870399 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870421 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data\") pod \"f9c687ae-84e1-44ed-801d-abbbff13acd9\" (UID: \"f9c687ae-84e1-44ed-801d-abbbff13acd9\") " Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870488 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870553 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.870634 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.872212 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.877437 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.880552 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.884966 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.888486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.897027 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm" (OuterVolumeSpecName: "kube-api-access-2btxm") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "kube-api-access-2btxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.897491 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.897760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.902081 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.904591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.904759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.905747 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.906190 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f9c687ae-84e1-44ed-801d-abbbff13acd9" (UID: "f9c687ae-84e1-44ed-801d-abbbff13acd9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971484 4744 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9c687ae-84e1-44ed-801d-abbbff13acd9-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971530 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971543 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971556 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971567 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971577 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971586 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971594 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971603 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971610 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2btxm\" (UniqueName: \"kubernetes.io/projected/f9c687ae-84e1-44ed-801d-abbbff13acd9-kube-api-access-2btxm\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971620 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971629 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:42 crc kubenswrapper[4744]: I1205 20:14:42.971637 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9c687ae-84e1-44ed-801d-abbbff13acd9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.348650 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.348750 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.348783 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.393130 4744 generic.go:334] "Generic (PLEG): container finished" podID="f9c687ae-84e1-44ed-801d-abbbff13acd9" containerID="54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40" exitCode=0 Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.393267 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.393359 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" event={"ID":"f9c687ae-84e1-44ed-801d-abbbff13acd9","Type":"ContainerDied","Data":"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40"} Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.393404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dn5pv" event={"ID":"f9c687ae-84e1-44ed-801d-abbbff13acd9","Type":"ContainerDied","Data":"145066b4556f16257df7179e48587828e23fb99560333d69419e4c3f39cd8eac"} Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.393432 4744 scope.go:117] "RemoveContainer" containerID="54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.429834 4744 scope.go:117] "RemoveContainer" containerID="54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40" Dec 05 20:14:43 crc kubenswrapper[4744]: E1205 20:14:43.430631 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40\": container with ID starting with 54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40 not found: ID does not exist" containerID="54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40" Dec 05 20:14:43 crc kubenswrapper[4744]: I1205 20:14:43.430692 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40"} err="failed to get container status \"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40\": rpc error: code = NotFound desc = could not find container \"54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40\": container with ID starting with 54793a0d17b32ddf541006453f42a34d0380e7c127b9e8c6f5d779cb10694b40 not found: ID does not exist" Dec 05 20:14:45 crc kubenswrapper[4744]: I1205 20:14:45.822127 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 20:14:46 crc kubenswrapper[4744]: I1205 20:14:46.373120 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 20:14:47 crc kubenswrapper[4744]: I1205 20:14:47.284704 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 20:14:47 crc kubenswrapper[4744]: I1205 20:14:47.483041 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 20:14:48 crc kubenswrapper[4744]: I1205 20:14:48.804880 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 20:14:48 crc kubenswrapper[4744]: I1205 20:14:48.982409 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 20:14:49 crc kubenswrapper[4744]: I1205 20:14:49.511552 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 20:14:49 crc kubenswrapper[4744]: I1205 20:14:49.544220 4744 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 20:14:49 crc kubenswrapper[4744]: I1205 20:14:49.696769 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 20:14:50 crc kubenswrapper[4744]: I1205 20:14:50.564210 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 20:14:50 crc kubenswrapper[4744]: I1205 20:14:50.808641 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 20:14:51 crc kubenswrapper[4744]: I1205 20:14:51.627269 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 20:14:51 crc kubenswrapper[4744]: I1205 20:14:51.686085 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 20:14:51 crc kubenswrapper[4744]: I1205 20:14:51.733435 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 20:14:51 crc kubenswrapper[4744]: I1205 20:14:51.835592 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 20:14:52 crc kubenswrapper[4744]: I1205 20:14:52.381642 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 20:14:52 crc kubenswrapper[4744]: I1205 20:14:52.483635 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.072199 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.133159 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.349455 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.349965 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.360531 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.498633 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.632927 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.634771 4744 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.685001 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.686944 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.838041 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 20:14:53 crc kubenswrapper[4744]: I1205 20:14:53.862864 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.027960 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.052491 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.178192 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.292573 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.384941 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.398489 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.599338 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.610018 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.814973 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.883237 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.927179 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.932921 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 20:14:54 crc kubenswrapper[4744]: I1205 20:14:54.944946 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.031134 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.043927 4744 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.136350 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.255967 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.266497 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.357014 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.477400 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.510650 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.653751 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.674798 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.768904 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.776711 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.866201 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.935533 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.947512 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:14:55 crc kubenswrapper[4744]: I1205 20:14:55.964641 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.091615 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.215980 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.230799 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.318667 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.363044 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.377359 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.406093 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.505972 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.541115 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.541683 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.628179 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.642843 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.686520 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.696681 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.750415 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.761502 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.772516 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.802907 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.808736 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.820436 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.826419 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.855909 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.881389 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.924578 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 20:14:56 crc kubenswrapper[4744]: I1205 20:14:56.928587 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.004189 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.091175 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.101655 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.117240 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.142769 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.221342 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.285982 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.293708 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.404027 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.415192 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.520629 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.558546 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.591613 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.659114 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.669763 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.785245 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.836714 4744 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.853010 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.897106 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.897123 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 20:14:57 crc kubenswrapper[4744]: I1205 20:14:57.915801 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.103125 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.153185 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.159427 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.311313 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.315262 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.437669 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.456629 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.488149 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.524894 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.559562 4744 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.715646 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.721252 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.836828 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.867685 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.894399 4744 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.902344 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:14:58 crc kubenswrapper[4744]: I1205 20:14:58.983392 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.174230 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.205937 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.229007 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.286053 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.322897 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.345231 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.398511 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.411008 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.442556 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.514686 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.530331 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.557322 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.617879 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.641418 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.643094 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.656906 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.873463 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.879578 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.892861 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.895998 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 20:14:59 crc kubenswrapper[4744]: I1205 20:14:59.907729 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.017450 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.067524 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.110162 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.136832 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.276087 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.298563 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.350977 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.491126 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.514737 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.531824 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.576928 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.620517 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.655050 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.678037 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.727252 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 
05 20:15:00 crc kubenswrapper[4744]: I1205 20:15:00.943040 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.022729 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.197004 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.279962 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.427103 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.444808 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.458831 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.499133 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.602856 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.721103 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.872648 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.884917 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 20:15:01 crc kubenswrapper[4744]: I1205 20:15:01.954858 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.011971 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.037605 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.269620 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.408131 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.547993 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.586814 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 20:15:02 crc 
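The "Caches populated" lines above come from client-go reflectors: the kubelet LISTs and WATCHes each ConfigMap/Secret a pod references and mirrors them into a local cache before starting containers. A minimal sketch of that pattern, assuming an in-cluster config and a 10-minute resync (both arbitrary here), not the kubelet's actual wiring:

```go
// Sketch: populating a local cache with a client-go shared informer,
// the mechanism behind the reflector.go:368 "Caches populated" lines.
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactory(clientset, 10*time.Minute)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop) // spawns the reflectors that LIST+WATCH the API server

	// Roughly the checkpoint the log reports as "Caches populated".
	if !cache.WaitForCacheSync(stop, cmInformer.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("ConfigMap cache populated")
}
```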
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.782115 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.825212 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.832643 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.881653 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.911169 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 05 20:15:02 crc kubenswrapper[4744]: I1205 20:15:02.997540 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.004048 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.046352 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.082202 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.108282 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.130181 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.183562 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.185958 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.262860 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.349348 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.349400 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.349451 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.350092 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"57bd4e3d730c702814d51c7b67aa61831bd4b7f114aba374d92516c6a93e1dc3"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.350353 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://57bd4e3d730c702814d51c7b67aa61831bd4b7f114aba374d92516c6a93e1dc3" gracePeriod=30
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.384494 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.492408 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.547017 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.603028 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.604418 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.709940 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.890187 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 20:15:03 crc kubenswrapper[4744]: I1205 20:15:03.925971 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.038255 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.055896 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.063193 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.089076 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.117344 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.141787 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
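The startup-probe failure above is an HTTPS GET against the controller manager's healthz endpoint; "connection refused" counts as a failed attempt, and after enough consecutive failures the kubelet kills and restarts the container (the gracePeriod=30 kill that follows). A minimal sketch of one such check, with the endpoint taken from the log; the real kubelet prober's TLS, header, and retry details differ:

```go
// Sketch: one HTTPS healthz probe attempt, approximating the check
// behind the "Startup probe status=failure" lines above.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func probeOnce(url string) error {
	client := &http.Client{
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			// Kubelet HTTPS probes do not verify the serving certificate.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: %s", resp.Status)
	}
	return nil
}

func main() {
	if err := probeOnce("https://192.168.126.11:10257/healthz"); err != nil {
		fmt.Println("probe failed:", err)
	}
}
```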
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.240072 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.241699 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.272046 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.290215 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.302771 4744 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.317983 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.390164 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.441584 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.668677 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.676560 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.676662 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.761745 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.895051 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 05 20:15:04 crc kubenswrapper[4744]: I1205 20:15:04.899558 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.114687 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.309944 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.340166 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.417856 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.499029 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.673022 4744 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 05 20:15:05 crc kubenswrapper[4744]: I1205 20:15:05.753119 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.049992 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.064468 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.328503 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.448753 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.733759 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 05 20:15:06 crc kubenswrapper[4744]: I1205 20:15:06.997997 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.006102 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.192192 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.275911 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.375383 4744 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383018 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dn5pv","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383136 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-69b55d54f6-spbxm"]
Dec 05 20:15:07 crc kubenswrapper[4744]: E1205 20:15:07.383457 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c687ae-84e1-44ed-801d-abbbff13acd9" containerName="oauth-openshift"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383486 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c687ae-84e1-44ed-801d-abbbff13acd9" containerName="oauth-openshift"
Dec 05 20:15:07 crc kubenswrapper[4744]: E1205 20:15:07.383511 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" containerName="installer"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383524 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" containerName="installer"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383706 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c687ae-84e1-44ed-801d-abbbff13acd9" containerName="oauth-openshift"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383708 4744 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383748 4744 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f44d0ab9-2456-45d6-bb68-fbc933c751a1"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.383733 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4cb2ea-f897-4ccf-888a-5c1eca4e4c41" containerName="installer"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.384479 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.387785 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.388268 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.389163 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.389179 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.389195 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.389269 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.391750 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.391751 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.392087 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.392356 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.393756 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.393831 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.393879 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
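The "SyncLoop ADD/UPDATE/REMOVE source=api" lines above are the kubelet reacting to a watch stream of Pod objects scheduled to this node (here, the oauth-openshift deployment rolling from the -558db77b4 to the -69b55d54f6 replica set). A minimal sketch of that kind of node-filtered pod watch, assuming an in-cluster config and the node name crc from the log; the kubelet's real config source adds caching and deduplication on top:

```go
// Sketch: watching Pod events for one node, the API-server source
// feeding the kubelet's SyncLoop ADD/UPDATE/REMOVE entries.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(cfg)

	// Only pods bound to this node, like the kubelet's own watch.
	w, err := clientset.CoreV1().Pods(metav1.NamespaceAll).Watch(context.Background(),
		metav1.ListOptions{FieldSelector: "spec.nodeName=crc"})
	if err != nil {
		panic(err)
	}
	defer w.Stop()

	for ev := range w.ResultChan() {
		// ADDED/MODIFIED/DELETED map onto SyncLoop ADD/UPDATE/DELETE-REMOVE.
		fmt.Println(ev.Type)
	}
}
```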
*v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.406876 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.408915 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.421948 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.42192292 podStartE2EDuration="28.42192292s" podCreationTimestamp="2025-12-05 20:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:07.421271664 +0000 UTC m=+277.651083032" watchObservedRunningTime="2025-12-05 20:15:07.42192292 +0000 UTC m=+277.651734318" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.422220 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579451 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-login\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579524 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579590 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579626 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-router-certs\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " 
pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcjg\" (UniqueName: \"kubernetes.io/projected/ddd129aa-309f-43fe-820a-d58a65d5392b-kube-api-access-dfcjg\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579679 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-session\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579705 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-dir\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579736 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-policies\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579826 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-error\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.579947 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-service-ca\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.680840 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-session\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.680911 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-dir\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.680975 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-policies\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681014 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-dir\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-error\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " 
pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681238 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-service-ca\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681317 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-login\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681454 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681541 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-router-certs\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.681577 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcjg\" (UniqueName: \"kubernetes.io/projected/ddd129aa-309f-43fe-820a-d58a65d5392b-kube-api-access-dfcjg\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: 
\"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.682667 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-service-ca\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.683132 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-audit-policies\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.683921 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.686665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.688375 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.688382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-router-certs\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.689715 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.690142 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-session\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " 
pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.690161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-login\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.691175 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-error\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.692121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.692683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ddd129aa-309f-43fe-820a-d58a65d5392b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.713232 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcjg\" (UniqueName: \"kubernetes.io/projected/ddd129aa-309f-43fe-820a-d58a65d5392b-kube-api-access-dfcjg\") pod \"oauth-openshift-69b55d54f6-spbxm\" (UID: \"ddd129aa-309f-43fe-820a-d58a65d5392b\") " pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.729217 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:07 crc kubenswrapper[4744]: I1205 20:15:07.947455 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69b55d54f6-spbxm"] Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.032000 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.093011 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c687ae-84e1-44ed-801d-abbbff13acd9" path="/var/lib/kubelet/pods/f9c687ae-84e1-44ed-801d-abbbff13acd9/volumes" Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.569090 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" event={"ID":"ddd129aa-309f-43fe-820a-d58a65d5392b","Type":"ContainerStarted","Data":"62ff5ad552f3cab77edf5e0ba6d8ddcf9458ef8d76f1c181fb766663dc81034e"} Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.569156 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" event={"ID":"ddd129aa-309f-43fe-820a-d58a65d5392b","Type":"ContainerStarted","Data":"fde5155e79b7c998ff4d8080ba7804b348b813d5f9202e317827407cee538a8f"} Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.569622 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.601548 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" podStartSLOduration=51.601511015 podStartE2EDuration="51.601511015s" podCreationTimestamp="2025-12-05 20:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:08.600539129 +0000 UTC m=+278.830350537" watchObservedRunningTime="2025-12-05 20:15:08.601511015 +0000 UTC m=+278.831322423" Dec 05 20:15:08 crc kubenswrapper[4744]: I1205 20:15:08.976052 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69b55d54f6-spbxm" Dec 05 20:15:09 crc kubenswrapper[4744]: I1205 20:15:09.335087 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 20:15:09 crc kubenswrapper[4744]: I1205 20:15:09.624478 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 20:15:13 crc kubenswrapper[4744]: I1205 20:15:12.997714 4744 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 20:15:13 crc kubenswrapper[4744]: I1205 20:15:12.998448 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca" gracePeriod=5 Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.604919 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.605342 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.642099 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.642175 4744 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca" exitCode=137 Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.642243 4744 scope.go:117] "RemoveContainer" containerID="0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.642270 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.661840 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.661900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.661917 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.661984 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662181 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662576 4744 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662647 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662705 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.662749 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.666724 4744 scope.go:117] "RemoveContainer" containerID="0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca" Dec 05 20:15:18 crc kubenswrapper[4744]: E1205 20:15:18.667628 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca\": container with ID starting with 0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca not found: ID does not exist" containerID="0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.667684 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca"} err="failed to get container status \"0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca\": rpc error: code = NotFound desc = could not find container \"0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca\": container with ID starting with 0cf137f1a23b3bf487c13638e43ad0d667a457ef12fdeefbbf9c5c36c6503eca not found: ID does not exist" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.676660 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.764094 4744 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.764153 4744 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.764174 4744 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:18 crc kubenswrapper[4744]: I1205 20:15:18.764193 4744 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:20 crc kubenswrapper[4744]: I1205 20:15:20.098894 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.368266 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr74j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.368354 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr74j container/marketplace-operator 
namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.368854 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.368919 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.722385 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerID="9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d" exitCode=0 Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.722462 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerDied","Data":"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d"} Dec 05 20:15:28 crc kubenswrapper[4744]: I1205 20:15:28.723389 4744 scope.go:117] "RemoveContainer" containerID="9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d" Dec 05 20:15:29 crc kubenswrapper[4744]: I1205 20:15:29.914185 4744 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 05 20:15:30 crc kubenswrapper[4744]: I1205 20:15:30.738812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerStarted","Data":"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123"} Dec 05 20:15:31 crc kubenswrapper[4744]: I1205 20:15:31.745718 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:15:31 crc kubenswrapper[4744]: I1205 20:15:31.748398 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" Dec 05 20:15:33 crc kubenswrapper[4744]: I1205 20:15:33.758737 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:15:33 crc kubenswrapper[4744]: I1205 20:15:33.760420 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 20:15:33 crc kubenswrapper[4744]: I1205 20:15:33.760462 4744 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="57bd4e3d730c702814d51c7b67aa61831bd4b7f114aba374d92516c6a93e1dc3" exitCode=137 Dec 05 20:15:33 crc kubenswrapper[4744]: I1205 20:15:33.760539 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
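The "ContainerStatus from runtime service failed ... code = NotFound" error above is a benign race: the container was already removed between listing and inspection, so the gRPC call to the CRI runtime returns NotFound and the kubelet logs it and moves on. A minimal sketch of that status-code handling; the query function here is a stand-in, not the real CRI client:

```go
// Sketch: treating gRPC NotFound from a container-status lookup as
// "already removed", the pattern behind the log error above.
package main

import (
	"context"
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

var errAlreadyGone = errors.New("container already removed")

// containerStatus wraps a runtime lookup (query is a hypothetical
// stand-in for a CRI ContainerStatus call) and absorbs NotFound.
func containerStatus(ctx context.Context, query func(context.Context, string) error, id string) error {
	err := query(ctx, id)
	if status.Code(err) == codes.NotFound {
		return errAlreadyGone // log and continue rather than fail the sync
	}
	return err
}

func main() {
	fake := func(ctx context.Context, id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(containerStatus(context.Background(), fake, "0cf137f1a23b"))
}
```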
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"57bd4e3d730c702814d51c7b67aa61831bd4b7f114aba374d92516c6a93e1dc3"} Dec 05 20:15:33 crc kubenswrapper[4744]: I1205 20:15:33.760625 4744 scope.go:117] "RemoveContainer" containerID="f7f54af6e6a60f5ffcc3820dbe2656632bc3be66be693c0aa4c7c4649d60e3c5" Dec 05 20:15:34 crc kubenswrapper[4744]: I1205 20:15:34.770575 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 20:15:34 crc kubenswrapper[4744]: I1205 20:15:34.772181 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d23f068de5656b15ee536146ca31f80d80dcb1ad7922f42407d1108d07a5fef2"} Dec 05 20:15:42 crc kubenswrapper[4744]: I1205 20:15:42.874917 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:15:43 crc kubenswrapper[4744]: I1205 20:15:43.348738 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:15:43 crc kubenswrapper[4744]: I1205 20:15:43.353202 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:15:44 crc kubenswrapper[4744]: I1205 20:15:44.847550 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 20:15:48 crc kubenswrapper[4744]: I1205 20:15:48.502909 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:15:48 crc kubenswrapper[4744]: I1205 20:15:48.504724 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7fvql" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="registry-server" containerID="cri-o://9c72da9e9f9915bf9bdc0d19779b490a23f097f6ab26740670049faa9f00ed24" gracePeriod=2 Dec 05 20:15:48 crc kubenswrapper[4744]: I1205 20:15:48.869271 4744 generic.go:334] "Generic (PLEG): container finished" podID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerID="9c72da9e9f9915bf9bdc0d19779b490a23f097f6ab26740670049faa9f00ed24" exitCode=0 Dec 05 20:15:48 crc kubenswrapper[4744]: I1205 20:15:48.869353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerDied","Data":"9c72da9e9f9915bf9bdc0d19779b490a23f097f6ab26740670049faa9f00ed24"} Dec 05 20:15:48 crc kubenswrapper[4744]: I1205 20:15:48.929176 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.011017 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities\") pod \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.011079 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content\") pod \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.011108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd8v4\" (UniqueName: \"kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4\") pod \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\" (UID: \"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4\") " Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.011923 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities" (OuterVolumeSpecName: "utilities") pod "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" (UID: "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.015654 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4" (OuterVolumeSpecName: "kube-api-access-bd8v4") pod "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" (UID: "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4"). InnerVolumeSpecName "kube-api-access-bd8v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.112632 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.112672 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd8v4\" (UniqueName: \"kubernetes.io/projected/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-kube-api-access-bd8v4\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.129824 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" (UID: "b57bf7af-b1cf-4cd9-b431-db0540c6ffc4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.214349 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.875687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fvql" event={"ID":"b57bf7af-b1cf-4cd9-b431-db0540c6ffc4","Type":"ContainerDied","Data":"769b3fec5fad69db94dd21a88a1ee77cb16728ed701166920dacbe7e37ddaf36"} Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.875748 4744 scope.go:117] "RemoveContainer" containerID="9c72da9e9f9915bf9bdc0d19779b490a23f097f6ab26740670049faa9f00ed24" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.875761 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fvql" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.910613 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.913002 4744 scope.go:117] "RemoveContainer" containerID="d7076a96891777dfe1b1f3f0dda9bfd096aac6098387aa30bccdef1f1ef3ba44" Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.915752 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7fvql"] Dec 05 20:15:49 crc kubenswrapper[4744]: I1205 20:15:49.934488 4744 scope.go:117] "RemoveContainer" containerID="9b77022da73ab10f9c3be2e48daf1f31928f40d36db4ef4defc311f4f2296f49" Dec 05 20:15:50 crc kubenswrapper[4744]: I1205 20:15:50.091016 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" path="/var/lib/kubelet/pods/b57bf7af-b1cf-4cd9-b431-db0540c6ffc4/volumes" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988225 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws"] Dec 05 20:15:54 crc kubenswrapper[4744]: E1205 20:15:54.988827 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="registry-server" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988839 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="registry-server" Dec 05 20:15:54 crc kubenswrapper[4744]: E1205 20:15:54.988847 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="extract-utilities" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988853 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="extract-utilities" Dec 05 20:15:54 crc kubenswrapper[4744]: E1205 20:15:54.988862 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="extract-content" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988868 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="extract-content" Dec 05 20:15:54 crc kubenswrapper[4744]: E1205 20:15:54.988876 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988882 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.988992 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57bf7af-b1cf-4cd9-b431-db0540c6ffc4" containerName="registry-server" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.989007 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.989383 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.993163 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:15:54 crc kubenswrapper[4744]: I1205 20:15:54.993470 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.006865 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws"] Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.041533 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.041719 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" podUID="53f9a23a-b663-4cbf-8c34-334f073e3092" containerName="route-controller-manager" containerID="cri-o://3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105" gracePeriod=30 Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.095525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljjl2\" (UniqueName: \"kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.095885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.095923 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.128112 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:15:55 crc 
kubenswrapper[4744]: I1205 20:15:55.128320 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" podUID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" containerName="controller-manager" containerID="cri-o://c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003" gracePeriod=30 Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.197136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljjl2\" (UniqueName: \"kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.197207 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.197239 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.198180 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.205087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.221946 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljjl2\" (UniqueName: \"kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2\") pod \"collect-profiles-29416095-5xwws\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.305255 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.414960 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.500367 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config\") pod \"53f9a23a-b663-4cbf-8c34-334f073e3092\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.500453 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert\") pod \"53f9a23a-b663-4cbf-8c34-334f073e3092\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.500558 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj295\" (UniqueName: \"kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295\") pod \"53f9a23a-b663-4cbf-8c34-334f073e3092\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.500585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca\") pod \"53f9a23a-b663-4cbf-8c34-334f073e3092\" (UID: \"53f9a23a-b663-4cbf-8c34-334f073e3092\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.501227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config" (OuterVolumeSpecName: "config") pod "53f9a23a-b663-4cbf-8c34-334f073e3092" (UID: "53f9a23a-b663-4cbf-8c34-334f073e3092"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.501456 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca" (OuterVolumeSpecName: "client-ca") pod "53f9a23a-b663-4cbf-8c34-334f073e3092" (UID: "53f9a23a-b663-4cbf-8c34-334f073e3092"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.506024 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53f9a23a-b663-4cbf-8c34-334f073e3092" (UID: "53f9a23a-b663-4cbf-8c34-334f073e3092"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.508136 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295" (OuterVolumeSpecName: "kube-api-access-cj295") pod "53f9a23a-b663-4cbf-8c34-334f073e3092" (UID: "53f9a23a-b663-4cbf-8c34-334f073e3092"). InnerVolumeSpecName "kube-api-access-cj295". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.519883 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.538162 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws"] Dec 05 20:15:55 crc kubenswrapper[4744]: W1205 20:15:55.540157 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0033b27_50c8_4ea2_afcc_ac3eac27171f.slice/crio-e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d WatchSource:0}: Error finding container e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d: Status 404 returned error can't find the container with id e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601567 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca\") pod \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601614 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h44h4\" (UniqueName: \"kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4\") pod \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601669 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert\") pod \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601716 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles\") pod \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601755 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config\") pod \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\" (UID: \"9429a50e-b1ff-480d-b8af-d0f095f8cd86\") " Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601952 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601974 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f9a23a-b663-4cbf-8c34-334f073e3092-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601984 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj295\" (UniqueName: \"kubernetes.io/projected/53f9a23a-b663-4cbf-8c34-334f073e3092-kube-api-access-cj295\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.601997 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/53f9a23a-b663-4cbf-8c34-334f073e3092-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.602575 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9429a50e-b1ff-480d-b8af-d0f095f8cd86" (UID: "9429a50e-b1ff-480d-b8af-d0f095f8cd86"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.602618 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca" (OuterVolumeSpecName: "client-ca") pod "9429a50e-b1ff-480d-b8af-d0f095f8cd86" (UID: "9429a50e-b1ff-480d-b8af-d0f095f8cd86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.602724 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config" (OuterVolumeSpecName: "config") pod "9429a50e-b1ff-480d-b8af-d0f095f8cd86" (UID: "9429a50e-b1ff-480d-b8af-d0f095f8cd86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.605646 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9429a50e-b1ff-480d-b8af-d0f095f8cd86" (UID: "9429a50e-b1ff-480d-b8af-d0f095f8cd86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.605677 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4" (OuterVolumeSpecName: "kube-api-access-h44h4") pod "9429a50e-b1ff-480d-b8af-d0f095f8cd86" (UID: "9429a50e-b1ff-480d-b8af-d0f095f8cd86"). InnerVolumeSpecName "kube-api-access-h44h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.703302 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9429a50e-b1ff-480d-b8af-d0f095f8cd86-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.703362 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.703375 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.703384 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9429a50e-b1ff-480d-b8af-d0f095f8cd86-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.703393 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h44h4\" (UniqueName: \"kubernetes.io/projected/9429a50e-b1ff-480d-b8af-d0f095f8cd86-kube-api-access-h44h4\") on node \"crc\" DevicePath \"\"" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.913830 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" event={"ID":"f0033b27-50c8-4ea2-afcc-ac3eac27171f","Type":"ContainerStarted","Data":"e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d"} Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.915733 4744 generic.go:334] "Generic (PLEG): container finished" podID="53f9a23a-b663-4cbf-8c34-334f073e3092" containerID="3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105" exitCode=0 Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.915782 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.915838 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" event={"ID":"53f9a23a-b663-4cbf-8c34-334f073e3092","Type":"ContainerDied","Data":"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105"} Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.915885 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c" event={"ID":"53f9a23a-b663-4cbf-8c34-334f073e3092","Type":"ContainerDied","Data":"8610abbc0652d06c8f8a195c1cacf07c292f848d22aa2a687d219f35584d0b21"} Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.915904 4744 scope.go:117] "RemoveContainer" containerID="3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.917965 4744 generic.go:334] "Generic (PLEG): container finished" podID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" containerID="c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003" exitCode=0 Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.918054 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.918047 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" event={"ID":"9429a50e-b1ff-480d-b8af-d0f095f8cd86","Type":"ContainerDied","Data":"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003"} Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.919474 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ccqxf" event={"ID":"9429a50e-b1ff-480d-b8af-d0f095f8cd86","Type":"ContainerDied","Data":"52a2ebed82a774207cec77751352b48d4b283fc2a9207d7cbbed035ea7339aa7"} Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.941139 4744 scope.go:117] "RemoveContainer" containerID="3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105" Dec 05 20:15:55 crc kubenswrapper[4744]: E1205 20:15:55.942205 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105\": container with ID starting with 3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105 not found: ID does not exist" containerID="3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.942474 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105"} err="failed to get container status \"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105\": rpc error: code = NotFound desc = could not find container \"3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105\": container with ID starting with 3061cab75472f9e338ec46f53fdc8cb9f67ecf93b423e88b4bbb16f3a36fa105 not found: ID does not exist" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.942679 4744 scope.go:117] "RemoveContainer" containerID="c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.951691 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.957693 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rv9c"] Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.963909 4744 scope.go:117] "RemoveContainer" containerID="c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003" Dec 05 20:15:55 crc kubenswrapper[4744]: E1205 20:15:55.964436 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003\": container with ID starting with c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003 not found: ID does not exist" containerID="c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.964467 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003"} err="failed to get container status 
\"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003\": rpc error: code = NotFound desc = could not find container \"c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003\": container with ID starting with c541f34d10c5cf65375ed3854441a4ea31bd298c4d4da83e7ee6bc251aa2e003 not found: ID does not exist" Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.969415 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:15:55 crc kubenswrapper[4744]: I1205 20:15:55.974594 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ccqxf"] Dec 05 20:15:56 crc kubenswrapper[4744]: I1205 20:15:56.089987 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f9a23a-b663-4cbf-8c34-334f073e3092" path="/var/lib/kubelet/pods/53f9a23a-b663-4cbf-8c34-334f073e3092/volumes" Dec 05 20:15:56 crc kubenswrapper[4744]: I1205 20:15:56.090700 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" path="/var/lib/kubelet/pods/9429a50e-b1ff-480d-b8af-d0f095f8cd86/volumes" Dec 05 20:15:56 crc kubenswrapper[4744]: I1205 20:15:56.930005 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0033b27-50c8-4ea2-afcc-ac3eac27171f" containerID="e44215b6ee488a12c0204c9a8be45a68b796b76d1851bc5528756f810c4ad5ac" exitCode=0 Dec 05 20:15:56 crc kubenswrapper[4744]: I1205 20:15:56.930226 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" event={"ID":"f0033b27-50c8-4ea2-afcc-ac3eac27171f","Type":"ContainerDied","Data":"e44215b6ee488a12c0204c9a8be45a68b796b76d1851bc5528756f810c4ad5ac"} Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.064093 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"] Dec 05 20:15:57 crc kubenswrapper[4744]: E1205 20:15:57.064457 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" containerName="controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.064491 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" containerName="controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: E1205 20:15:57.064525 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f9a23a-b663-4cbf-8c34-334f073e3092" containerName="route-controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.064536 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f9a23a-b663-4cbf-8c34-334f073e3092" containerName="route-controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.064680 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f9a23a-b663-4cbf-8c34-334f073e3092" containerName="route-controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.064714 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9429a50e-b1ff-480d-b8af-d0f095f8cd86" containerName="controller-manager" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.065337 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.068907 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.069326 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.069839 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.070164 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.070802 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.073500 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.077150 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb"] Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.079517 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.080769 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.084979 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.085426 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.085612 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.085749 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.085838 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.089349 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"] Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.089972 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.097358 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb"] Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123057 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-client-ca\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gjc\" (UniqueName: \"kubernetes.io/projected/0c6cd033-c72e-4ab9-af74-0e39097dd275-kube-api-access-x9gjc\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123259 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6cd033-c72e-4ab9-af74-0e39097dd275-serving-cert\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123310 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-config\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123378 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123407 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.123479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhkp4\" 
(UniqueName: \"kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-config\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224559 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224597 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224636 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224681 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhkp4\" (UniqueName: \"kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224739 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-client-ca\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224769 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224814 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gjc\" (UniqueName: \"kubernetes.io/projected/0c6cd033-c72e-4ab9-af74-0e39097dd275-kube-api-access-x9gjc\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: 
\"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.224865 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6cd033-c72e-4ab9-af74-0e39097dd275-serving-cert\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.231444 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6cd033-c72e-4ab9-af74-0e39097dd275-serving-cert\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.232481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.232481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-client-ca\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.232932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd033-c72e-4ab9-af74-0e39097dd275-config\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.233444 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.233902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.235510 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.256544 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fhkp4\" (UniqueName: \"kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4\") pod \"controller-manager-7b9c89f94d-gkfwl\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") " pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.258815 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gjc\" (UniqueName: \"kubernetes.io/projected/0c6cd033-c72e-4ab9-af74-0e39097dd275-kube-api-access-x9gjc\") pod \"route-controller-manager-bb69f5dbb-9mksb\" (UID: \"0c6cd033-c72e-4ab9-af74-0e39097dd275\") " pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.397352 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.413630 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.756427 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"] Dec 05 20:15:57 crc kubenswrapper[4744]: W1205 20:15:57.766683 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3de19905_69da_4470_a77e_341a5e71412f.slice/crio-2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2 WatchSource:0}: Error finding container 2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2: Status 404 returned error can't find the container with id 2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2 Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.781493 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb"] Dec 05 20:15:57 crc kubenswrapper[4744]: W1205 20:15:57.791225 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6cd033_c72e_4ab9_af74_0e39097dd275.slice/crio-50f8168ea40c9b1a5060288c69a31a2dea4dea09c07bb1c694e3b2440e94158f WatchSource:0}: Error finding container 50f8168ea40c9b1a5060288c69a31a2dea4dea09c07bb1c694e3b2440e94158f: Status 404 returned error can't find the container with id 50f8168ea40c9b1a5060288c69a31a2dea4dea09c07bb1c694e3b2440e94158f Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.937064 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" event={"ID":"0c6cd033-c72e-4ab9-af74-0e39097dd275","Type":"ContainerStarted","Data":"ed65e79de29e818b3df97eefc962a6537c82ca4313641be62918756ee9c584da"} Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.937106 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" event={"ID":"0c6cd033-c72e-4ab9-af74-0e39097dd275","Type":"ContainerStarted","Data":"50f8168ea40c9b1a5060288c69a31a2dea4dea09c07bb1c694e3b2440e94158f"} Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.937340 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.938238 4744 patch_prober.go:28] interesting pod/route-controller-manager-bb69f5dbb-9mksb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.938307 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" podUID="0c6cd033-c72e-4ab9-af74-0e39097dd275" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.940125 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" event={"ID":"3de19905-69da-4470-a77e-341a5e71412f","Type":"ContainerStarted","Data":"d3303cf6c938ba77cfa59df7fd22ef243b9ec76d38a7f2d989dbf604276b55b5"} Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.940163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.940174 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" event={"ID":"3de19905-69da-4470-a77e-341a5e71412f","Type":"ContainerStarted","Data":"2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2"} Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.941639 4744 patch_prober.go:28] interesting pod/controller-manager-7b9c89f94d-gkfwl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.941676 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" podUID="3de19905-69da-4470-a77e-341a5e71412f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 05 20:15:57 crc kubenswrapper[4744]: I1205 20:15:57.953933 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb" podStartSLOduration=2.953914543 podStartE2EDuration="2.953914543s" podCreationTimestamp="2025-12-05 20:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:57.953010369 +0000 UTC m=+328.182821747" watchObservedRunningTime="2025-12-05 20:15:57.953914543 +0000 UTC m=+328.183725911" Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.173770 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws"
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.185535 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" podStartSLOduration=3.185515183 podStartE2EDuration="3.185515183s" podCreationTimestamp="2025-12-05 20:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:15:57.977065044 +0000 UTC m=+328.206876422" watchObservedRunningTime="2025-12-05 20:15:58.185515183 +0000 UTC m=+328.415326551"
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.249609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume\") pod \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") "
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.249687 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume\") pod \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") "
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.249781 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljjl2\" (UniqueName: \"kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2\") pod \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\" (UID: \"f0033b27-50c8-4ea2-afcc-ac3eac27171f\") "
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.250593 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume" (OuterVolumeSpecName: "config-volume") pod "f0033b27-50c8-4ea2-afcc-ac3eac27171f" (UID: "f0033b27-50c8-4ea2-afcc-ac3eac27171f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.256027 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f0033b27-50c8-4ea2-afcc-ac3eac27171f" (UID: "f0033b27-50c8-4ea2-afcc-ac3eac27171f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.257272 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2" (OuterVolumeSpecName: "kube-api-access-ljjl2") pod "f0033b27-50c8-4ea2-afcc-ac3eac27171f" (UID: "f0033b27-50c8-4ea2-afcc-ac3eac27171f"). InnerVolumeSpecName "kube-api-access-ljjl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.351255 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljjl2\" (UniqueName: \"kubernetes.io/projected/f0033b27-50c8-4ea2-afcc-ac3eac27171f-kube-api-access-ljjl2\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.351285 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0033b27-50c8-4ea2-afcc-ac3eac27171f-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.351307 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0033b27-50c8-4ea2-afcc-ac3eac27171f-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.948673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws" event={"ID":"f0033b27-50c8-4ea2-afcc-ac3eac27171f","Type":"ContainerDied","Data":"e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d"}
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.948974 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7ff34b0d9da18ee4e49e27afb8d4fc0abcf0313b1e0e8447980ebac289ee72d"
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.948730 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416095-5xwws"
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.954225 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"
Dec 05 20:15:58 crc kubenswrapper[4744]: I1205 20:15:58.955823 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bb69f5dbb-9mksb"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.045679 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqg2r"]
Dec 05 20:16:02 crc kubenswrapper[4744]: E1205 20:16:02.049118 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0033b27-50c8-4ea2-afcc-ac3eac27171f" containerName="collect-profiles"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.049528 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0033b27-50c8-4ea2-afcc-ac3eac27171f" containerName="collect-profiles"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.049777 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0033b27-50c8-4ea2-afcc-ac3eac27171f" containerName="collect-profiles"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.050417 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.070466 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqg2r"]
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204768 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvw7r\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-kube-api-access-xvw7r\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204800 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3098268-0558-4e54-966c-6f2ab46443b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-registry-certificates\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3098268-0558-4e54-966c-6f2ab46443b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204864 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-trusted-ca\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-bound-sa-token\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.204917 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-registry-tls\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.226151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306507 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-bound-sa-token\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306551 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-registry-tls\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306585 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvw7r\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-kube-api-access-xvw7r\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306618 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3098268-0558-4e54-966c-6f2ab46443b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-registry-certificates\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3098268-0558-4e54-966c-6f2ab46443b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.306678 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-trusted-ca\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.308063 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3098268-0558-4e54-966c-6f2ab46443b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.308625 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-registry-certificates\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.308809 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3098268-0558-4e54-966c-6f2ab46443b6-trusted-ca\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.313003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3098268-0558-4e54-966c-6f2ab46443b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.313434 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-registry-tls\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.347986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-bound-sa-token\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.348127 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvw7r\" (UniqueName: \"kubernetes.io/projected/c3098268-0558-4e54-966c-6f2ab46443b6-kube-api-access-xvw7r\") pod \"image-registry-66df7c8f76-cqg2r\" (UID: \"c3098268-0558-4e54-966c-6f2ab46443b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.375477 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.817721 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cqg2r"]
Dec 05 20:16:02 crc kubenswrapper[4744]: W1205 20:16:02.828071 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3098268_0558_4e54_966c_6f2ab46443b6.slice/crio-fb2186adf200346e360b4436e152ba53fdf2627856ed8aca07b10ebbe2a5ff5f WatchSource:0}: Error finding container fb2186adf200346e360b4436e152ba53fdf2627856ed8aca07b10ebbe2a5ff5f: Status 404 returned error can't find the container with id fb2186adf200346e360b4436e152ba53fdf2627856ed8aca07b10ebbe2a5ff5f
Dec 05 20:16:02 crc kubenswrapper[4744]: I1205 20:16:02.978126 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r" event={"ID":"c3098268-0558-4e54-966c-6f2ab46443b6","Type":"ContainerStarted","Data":"fb2186adf200346e360b4436e152ba53fdf2627856ed8aca07b10ebbe2a5ff5f"}
Dec 05 20:16:03 crc kubenswrapper[4744]: I1205 20:16:03.986207 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r" event={"ID":"c3098268-0558-4e54-966c-6f2ab46443b6","Type":"ContainerStarted","Data":"7f38006642c1bb9801e0816a9a28d5a9b0b34324e53c552a2bc31faeaa298f28"}
Dec 05 20:16:03 crc kubenswrapper[4744]: I1205 20:16:03.988954 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:04 crc kubenswrapper[4744]: I1205 20:16:04.013853 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r" podStartSLOduration=2.013639549 podStartE2EDuration="2.013639549s" podCreationTimestamp="2025-12-05 20:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:16:04.010674801 +0000 UTC m=+334.240486169" watchObservedRunningTime="2025-12-05 20:16:04.013639549 +0000 UTC m=+334.243450927"
Dec 05 20:16:22 crc kubenswrapper[4744]: I1205 20:16:22.385560 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cqg2r"
Dec 05 20:16:22 crc kubenswrapper[4744]: I1205 20:16:22.448068 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"]
Dec 05 20:16:33 crc kubenswrapper[4744]: I1205 20:16:33.852429 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"]
Dec 05 20:16:33 crc kubenswrapper[4744]: I1205 20:16:33.853426 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" podUID="3de19905-69da-4470-a77e-341a5e71412f" containerName="controller-manager" containerID="cri-o://d3303cf6c938ba77cfa59df7fd22ef243b9ec76d38a7f2d989dbf604276b55b5" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.164836 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.165511 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9sq9s" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="registry-server" containerID="cri-o://2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.171470 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x6x6"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.171718 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5x6x6" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="registry-server" containerID="cri-o://0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.184575 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.184785 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" containerID="cri-o://d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.198121 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.199214 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tclr2" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="registry-server" containerID="cri-o://4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.211436 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.211705 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dd7w8" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="registry-server" containerID="cri-o://bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519" gracePeriod=30
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.217897 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p727x"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.218770 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.229238 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p727x"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.246867 4744 generic.go:334] "Generic (PLEG): container finished" podID="3de19905-69da-4470-a77e-341a5e71412f" containerID="d3303cf6c938ba77cfa59df7fd22ef243b9ec76d38a7f2d989dbf604276b55b5" exitCode=0
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.246906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" event={"ID":"3de19905-69da-4470-a77e-341a5e71412f","Type":"ContainerDied","Data":"d3303cf6c938ba77cfa59df7fd22ef243b9ec76d38a7f2d989dbf604276b55b5"}
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.246936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl" event={"ID":"3de19905-69da-4470-a77e-341a5e71412f","Type":"ContainerDied","Data":"2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2"}
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.246946 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b19c7092d33d1f2f6b76c177f21ed5eec8111b0ed183935f91d8893465c06a2"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.322841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.322923 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.322945 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnwj\" (UniqueName: \"kubernetes.io/projected/f1fd1d53-3fde-4ef7-be02-f689ce95885b-kube-api-access-srnwj\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: E1205 20:16:34.376953 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db367c1_8f1b_4096_9f23_5a3d14d3980f.slice/crio-0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db367c1_8f1b_4096_9f23_5a3d14d3980f.slice/crio-conmon-0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649cba80_0f59_449e_8a48_fbb1b4d373e3.slice/crio-bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.424424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srnwj\" (UniqueName: \"kubernetes.io/projected/f1fd1d53-3fde-4ef7-be02-f689ce95885b-kube-api-access-srnwj\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.424512 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.424591 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.425967 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.434818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f1fd1d53-3fde-4ef7-be02-f689ce95885b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.442795 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnwj\" (UniqueName: \"kubernetes.io/projected/f1fd1d53-3fde-4ef7-be02-f689ce95885b-kube-api-access-srnwj\") pod \"marketplace-operator-79b997595-p727x\" (UID: \"f1fd1d53-3fde-4ef7-be02-f689ce95885b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.576240 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.590243 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.601527 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.614210 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.639949 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tclr2"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.643049 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd7w8"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.652073 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert\") pod \"3de19905-69da-4470-a77e-341a5e71412f\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733148 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsszc\" (UniqueName: \"kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc\") pod \"8bfdca92-a782-4806-a2c0-e54302fd24a4\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733175 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content\") pod \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca\") pod \"3de19905-69da-4470-a77e-341a5e71412f\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733221 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content\") pod \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733242 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dvtx\" (UniqueName: \"kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx\") pod \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733282 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config\") pod \"3de19905-69da-4470-a77e-341a5e71412f\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733319 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics\") pod \"8bfdca92-a782-4806-a2c0-e54302fd24a4\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733345 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities\") pod \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities\") pod \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\" (UID: \"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733381 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhkp4\" (UniqueName: \"kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4\") pod \"3de19905-69da-4470-a77e-341a5e71412f\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-849np\" (UniqueName: \"kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np\") pod \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles\") pod \"3de19905-69da-4470-a77e-341a5e71412f\" (UID: \"3de19905-69da-4470-a77e-341a5e71412f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities\") pod \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733474 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca\") pod \"8bfdca92-a782-4806-a2c0-e54302fd24a4\" (UID: \"8bfdca92-a782-4806-a2c0-e54302fd24a4\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733490 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities\") pod \"649cba80-0f59-449e-8a48-fbb1b4d373e3\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content\") pod \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\" (UID: \"2db367c1-8f1b-4096-9f23-5a3d14d3980f\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content\") pod \"649cba80-0f59-449e-8a48-fbb1b4d373e3\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733555 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7txn\" (UniqueName: \"kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn\") pod \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\" (UID: \"f76f1c47-c74d-46cb-ad16-db7392a47a9b\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.733574 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqpw\" (UniqueName: \"kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw\") pod \"649cba80-0f59-449e-8a48-fbb1b4d373e3\" (UID: \"649cba80-0f59-449e-8a48-fbb1b4d373e3\") "
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.734928 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities" (OuterVolumeSpecName: "utilities") pod "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" (UID: "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.735120 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3de19905-69da-4470-a77e-341a5e71412f" (UID: "3de19905-69da-4470-a77e-341a5e71412f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.735147 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config" (OuterVolumeSpecName: "config") pod "3de19905-69da-4470-a77e-341a5e71412f" (UID: "3de19905-69da-4470-a77e-341a5e71412f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.735530 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca" (OuterVolumeSpecName: "client-ca") pod "3de19905-69da-4470-a77e-341a5e71412f" (UID: "3de19905-69da-4470-a77e-341a5e71412f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.736463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities" (OuterVolumeSpecName: "utilities") pod "f76f1c47-c74d-46cb-ad16-db7392a47a9b" (UID: "f76f1c47-c74d-46cb-ad16-db7392a47a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.736517 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities" (OuterVolumeSpecName: "utilities") pod "649cba80-0f59-449e-8a48-fbb1b4d373e3" (UID: "649cba80-0f59-449e-8a48-fbb1b4d373e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.736773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities" (OuterVolumeSpecName: "utilities") pod "2db367c1-8f1b-4096-9f23-5a3d14d3980f" (UID: "2db367c1-8f1b-4096-9f23-5a3d14d3980f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.737750 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8bfdca92-a782-4806-a2c0-e54302fd24a4" (UID: "8bfdca92-a782-4806-a2c0-e54302fd24a4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.739794 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw" (OuterVolumeSpecName: "kube-api-access-lqqpw") pod "649cba80-0f59-449e-8a48-fbb1b4d373e3" (UID: "649cba80-0f59-449e-8a48-fbb1b4d373e3"). InnerVolumeSpecName "kube-api-access-lqqpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.740222 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4" (OuterVolumeSpecName: "kube-api-access-fhkp4") pod "3de19905-69da-4470-a77e-341a5e71412f" (UID: "3de19905-69da-4470-a77e-341a5e71412f"). InnerVolumeSpecName "kube-api-access-fhkp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.740265 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc" (OuterVolumeSpecName: "kube-api-access-bsszc") pod "8bfdca92-a782-4806-a2c0-e54302fd24a4" (UID: "8bfdca92-a782-4806-a2c0-e54302fd24a4"). InnerVolumeSpecName "kube-api-access-bsszc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.741814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn" (OuterVolumeSpecName: "kube-api-access-w7txn") pod "f76f1c47-c74d-46cb-ad16-db7392a47a9b" (UID: "f76f1c47-c74d-46cb-ad16-db7392a47a9b"). InnerVolumeSpecName "kube-api-access-w7txn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.742430 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8bfdca92-a782-4806-a2c0-e54302fd24a4" (UID: "8bfdca92-a782-4806-a2c0-e54302fd24a4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.745322 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx" (OuterVolumeSpecName: "kube-api-access-5dvtx") pod "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" (UID: "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8"). InnerVolumeSpecName "kube-api-access-5dvtx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.750951 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np" (OuterVolumeSpecName: "kube-api-access-849np") pod "2db367c1-8f1b-4096-9f23-5a3d14d3980f" (UID: "2db367c1-8f1b-4096-9f23-5a3d14d3980f"). InnerVolumeSpecName "kube-api-access-849np". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.754893 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3de19905-69da-4470-a77e-341a5e71412f" (UID: "3de19905-69da-4470-a77e-341a5e71412f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.761758 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" (UID: "f07b8700-0120-4aa2-bd07-8a6f06d8dbf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.797259 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db367c1-8f1b-4096-9f23-5a3d14d3980f" (UID: "2db367c1-8f1b-4096-9f23-5a3d14d3980f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.800850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f76f1c47-c74d-46cb-ad16-db7392a47a9b" (UID: "f76f1c47-c74d-46cb-ad16-db7392a47a9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.829161 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p727x"]
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834522 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de19905-69da-4470-a77e-341a5e71412f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834549 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsszc\" (UniqueName: \"kubernetes.io/projected/8bfdca92-a782-4806-a2c0-e54302fd24a4-kube-api-access-bsszc\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834561 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834570 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-client-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834579 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834588 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dvtx\" (UniqueName: \"kubernetes.io/projected/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-kube-api-access-5dvtx\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834595 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-config\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834604 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834612 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76f1c47-c74d-46cb-ad16-db7392a47a9b-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834620 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834629 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhkp4\" (UniqueName: \"kubernetes.io/projected/3de19905-69da-4470-a77e-341a5e71412f-kube-api-access-fhkp4\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834638 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-849np\" (UniqueName: \"kubernetes.io/projected/2db367c1-8f1b-4096-9f23-5a3d14d3980f-kube-api-access-849np\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834646 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3de19905-69da-4470-a77e-341a5e71412f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834653 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834662 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bfdca92-a782-4806-a2c0-e54302fd24a4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834670 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834678 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db367c1-8f1b-4096-9f23-5a3d14d3980f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834686 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7txn\" (UniqueName: \"kubernetes.io/projected/f76f1c47-c74d-46cb-ad16-db7392a47a9b-kube-api-access-w7txn\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.834695 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqpw\" (UniqueName: \"kubernetes.io/projected/649cba80-0f59-449e-8a48-fbb1b4d373e3-kube-api-access-lqqpw\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.873730 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "649cba80-0f59-449e-8a48-fbb1b4d373e3" (UID: "649cba80-0f59-449e-8a48-fbb1b4d373e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:34 crc kubenswrapper[4744]: I1205 20:16:34.936215 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649cba80-0f59-449e-8a48-fbb1b4d373e3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102018 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7687c74b75-cjptt"]
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102198 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102209 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102217 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102224 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102240 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102247 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102255 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102262 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102271 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de19905-69da-4470-a77e-341a5e71412f" containerName="controller-manager"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102277 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de19905-69da-4470-a77e-341a5e71412f" containerName="controller-manager"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102285 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102305 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102313 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102318 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102326 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102332 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102339 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102344 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102350 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102356 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102364 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102369 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102376 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102381 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="extract-utilities"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102389 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102395 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="extract-content"
Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.102403 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102409 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102499 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102509 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102518 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de19905-69da-4470-a77e-341a5e71412f" containerName="controller-manager"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102525 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102533 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102542 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerName="registry-server"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.102874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.108350 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7687c74b75-cjptt"]
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.239119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5dc9d81-3024-44d6-b86a-1fca22004385-serving-cert\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.239492 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-proxy-ca-bundles\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.239566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5rk\" (UniqueName: \"kubernetes.io/projected/d5dc9d81-3024-44d6-b86a-1fca22004385-kube-api-access-zl5rk\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.239594 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-config\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.239620 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-client-ca\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.253730 4744 generic.go:334] "Generic (PLEG): container finished" podID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" containerID="0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1" exitCode=0
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.253765 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerDied","Data":"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.253799 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x6x6" event={"ID":"2db367c1-8f1b-4096-9f23-5a3d14d3980f","Type":"ContainerDied","Data":"3305a533fbc19e835c6f9084ca683a4544a5aa935d68f2ec7d230c76caf1a32a"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.253820 4744 scope.go:117] "RemoveContainer" containerID="0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.253820 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x6x6"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.255725 4744 generic.go:334] "Generic (PLEG): container finished" podID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" containerID="4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d" exitCode=0
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.255792 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerDied","Data":"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.255822 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tclr2"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.255825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tclr2" event={"ID":"f07b8700-0120-4aa2-bd07-8a6f06d8dbf8","Type":"ContainerDied","Data":"a8e58c6a9239fb2131c05a950afcdc44b13b62007adb69fbb842e1cdd00360ae"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.258172 4744 generic.go:334] "Generic (PLEG): container finished" podID="649cba80-0f59-449e-8a48-fbb1b4d373e3" containerID="bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519" exitCode=0
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.258228 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerDied","Data":"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.258254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd7w8" event={"ID":"649cba80-0f59-449e-8a48-fbb1b4d373e3","Type":"ContainerDied","Data":"8e1d333ec24ef0f19ad98a76a30bb0176804b1cf36e3a133990ddcb25c320b12"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.258325 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd7w8"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.261226 4744 generic.go:334] "Generic (PLEG): container finished" podID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerID="d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123" exitCode=0
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.261309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerDied","Data":"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.261335 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j" event={"ID":"8bfdca92-a782-4806-a2c0-e54302fd24a4","Type":"ContainerDied","Data":"0ef306a513dc0ee16ea9f8bda71f5f6eefe8a0b4bc2465c1f7246d03221cc1cf"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.261391 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr74j"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.267924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p727x" event={"ID":"f1fd1d53-3fde-4ef7-be02-f689ce95885b","Type":"ContainerStarted","Data":"72dfc1ae0dbed4384430610c3b779573c5a64ce57254c995a92b7e2eab70c1ab"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.267963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p727x" event={"ID":"f1fd1d53-3fde-4ef7-be02-f689ce95885b","Type":"ContainerStarted","Data":"608dec5bc65c6bbf3dc41a5d3d94af5965703e61bbf2f3377b7a8028bb8fecb1"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.268111 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.269849 4744 scope.go:117] "RemoveContainer" containerID="d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.271135 4744 generic.go:334] "Generic (PLEG): container finished" podID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" containerID="2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5" exitCode=0
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.271191 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9sq9s"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.271224 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerDied","Data":"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.271264 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9sq9s" event={"ID":"f76f1c47-c74d-46cb-ad16-db7392a47a9b","Type":"ContainerDied","Data":"c6cd24b6d9af028602bd7a8afb08e21d47a9c986881379099cdf99834623c6b2"}
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.271198 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.280605 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p727x"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.281725 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p727x" podStartSLOduration=1.281712541 podStartE2EDuration="1.281712541s" podCreationTimestamp="2025-12-05 20:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:16:35.281223548 +0000 UTC m=+365.511034916" watchObservedRunningTime="2025-12-05 20:16:35.281712541 +0000 UTC m=+365.511523909"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.299612 4744 scope.go:117] "RemoveContainer" containerID="74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.337755 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x6x6"]
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344304 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5x6x6"]
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-proxy-ca-bundles\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344436 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5rk\" (UniqueName: \"kubernetes.io/projected/d5dc9d81-3024-44d6-b86a-1fca22004385-kube-api-access-zl5rk\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-config\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344509 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-client-ca\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.344562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5dc9d81-3024-44d6-b86a-1fca22004385-serving-cert\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt"
Dec 05 20:16:35 crc kubenswrapper[4744]: I1205
20:16:35.345573 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-proxy-ca-bundles\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.345831 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-client-ca\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.348809 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5dc9d81-3024-44d6-b86a-1fca22004385-serving-cert\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.354769 4744 scope.go:117] "RemoveContainer" containerID="0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.355204 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1\": container with ID starting with 0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1 not found: ID does not exist" containerID="0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.355237 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1"} err="failed to get container status \"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1\": rpc error: code = NotFound desc = could not find container \"0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1\": container with ID starting with 0cdd43d34cccaf99eea0536c9c5230dcc3370d69fee7816d9e404f2611f48af1 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.355260 4744 scope.go:117] "RemoveContainer" containerID="d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.357856 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"] Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.358880 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11\": container with ID starting with d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11 not found: ID does not exist" containerID="d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.358911 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11"} err="failed to get container status 
\"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11\": rpc error: code = NotFound desc = could not find container \"d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11\": container with ID starting with d85df125698bde150bbf620acff02e954eb9629fe0f759a7c05c07b36d26cd11 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.358925 4744 scope.go:117] "RemoveContainer" containerID="74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.359175 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c\": container with ID starting with 74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c not found: ID does not exist" containerID="74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.359194 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c"} err="failed to get container status \"74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c\": rpc error: code = NotFound desc = could not find container \"74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c\": container with ID starting with 74810375b874abee89ee211b3e99035a23fa85923888b08fab4657ee630b156c not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.359206 4744 scope.go:117] "RemoveContainer" containerID="4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.363839 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc9d81-3024-44d6-b86a-1fca22004385-config\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.365144 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr74j"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.367495 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5rk\" (UniqueName: \"kubernetes.io/projected/d5dc9d81-3024-44d6-b86a-1fca22004385-kube-api-access-zl5rk\") pod \"controller-manager-7687c74b75-cjptt\" (UID: \"d5dc9d81-3024-44d6-b86a-1fca22004385\") " pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.391046 4744 scope.go:117] "RemoveContainer" containerID="1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.398619 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.406388 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b9c89f94d-gkfwl"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.409905 4744 scope.go:117] "RemoveContainer" containerID="d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.411545 
4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.415248 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.426794 4744 scope.go:117] "RemoveContainer" containerID="4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.427215 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d\": container with ID starting with 4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d not found: ID does not exist" containerID="4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.427247 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d"} err="failed to get container status \"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d\": rpc error: code = NotFound desc = could not find container \"4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d\": container with ID starting with 4ffdd88e145c16831c1c98719ec8431b8edab4533405566124b3ee38817b828d not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.427268 4744 scope.go:117] "RemoveContainer" containerID="1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.427881 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15\": container with ID starting with 1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15 not found: ID does not exist" containerID="1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.427936 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15"} err="failed to get container status \"1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15\": rpc error: code = NotFound desc = could not find container \"1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15\": container with ID starting with 1afa242742fbfd72482f43c0dc0117d1b03043f048947c003c8631bd81e93b15 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.427971 4744 scope.go:117] "RemoveContainer" containerID="d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.428214 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf\": container with ID starting with d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf not found: ID does not exist" containerID="d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.428234 4744 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf"} err="failed to get container status \"d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf\": rpc error: code = NotFound desc = could not find container \"d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf\": container with ID starting with d9c6600bb5bf76bb8f0503b2e714255940fe1c0ad630b9316239315853a73edf not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.428247 4744 scope.go:117] "RemoveContainer" containerID="bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.429117 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dd7w8"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.434395 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.439702 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tclr2"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.441600 4744 scope.go:117] "RemoveContainer" containerID="76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.444421 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.450556 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9sq9s"] Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.458889 4744 scope.go:117] "RemoveContainer" containerID="f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.476903 4744 scope.go:117] "RemoveContainer" containerID="bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.477243 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519\": container with ID starting with bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519 not found: ID does not exist" containerID="bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477271 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519"} err="failed to get container status \"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519\": rpc error: code = NotFound desc = could not find container \"bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519\": container with ID starting with bf505c016b38f1c6e5b022f3ba3ec5026e1ef6222a697c831562b1819dc71519 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477306 4744 scope.go:117] "RemoveContainer" containerID="76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.477484 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2\": 
container with ID starting with 76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2 not found: ID does not exist" containerID="76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477498 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2"} err="failed to get container status \"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2\": rpc error: code = NotFound desc = could not find container \"76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2\": container with ID starting with 76319e80e00bcd06b7a5f6089cf069ca74f9b1faa685462aa8d0dd5e72a306b2 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477509 4744 scope.go:117] "RemoveContainer" containerID="f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.477658 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2\": container with ID starting with f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2 not found: ID does not exist" containerID="f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477671 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2"} err="failed to get container status \"f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2\": rpc error: code = NotFound desc = could not find container \"f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2\": container with ID starting with f827d7b888ca1031c0186ebd688af3ff5d7821d02f48a0851beb75c12709efb2 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.477682 4744 scope.go:117] "RemoveContainer" containerID="d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.490685 4744 scope.go:117] "RemoveContainer" containerID="9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.509921 4744 scope.go:117] "RemoveContainer" containerID="d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.510278 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123\": container with ID starting with d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123 not found: ID does not exist" containerID="d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.510335 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123"} err="failed to get container status \"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123\": rpc error: code = NotFound desc = could not find container \"d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123\": container with ID starting with 
d31e745c91e3262ef9a281821a591e85878133c5354b36cf9d09d82c8aff0123 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.510365 4744 scope.go:117] "RemoveContainer" containerID="9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.510727 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d\": container with ID starting with 9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d not found: ID does not exist" containerID="9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.510745 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d"} err="failed to get container status \"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d\": rpc error: code = NotFound desc = could not find container \"9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d\": container with ID starting with 9c7bf640e7b8f0575889e7daa44788ec557b7392eff3dc1d69a5c390688c4e1d not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.510759 4744 scope.go:117] "RemoveContainer" containerID="2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.524933 4744 scope.go:117] "RemoveContainer" containerID="410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.538813 4744 scope.go:117] "RemoveContainer" containerID="e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.577969 4744 scope.go:117] "RemoveContainer" containerID="2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.580789 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5\": container with ID starting with 2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5 not found: ID does not exist" containerID="2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.580848 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5"} err="failed to get container status \"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5\": rpc error: code = NotFound desc = could not find container \"2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5\": container with ID starting with 2b77e22a6982778d23d7eae56b7085a1c12dfd134a958caa64c474628e19d0c5 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.580882 4744 scope.go:117] "RemoveContainer" containerID="410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.581415 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7\": container 
with ID starting with 410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7 not found: ID does not exist" containerID="410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.581453 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7"} err="failed to get container status \"410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7\": rpc error: code = NotFound desc = could not find container \"410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7\": container with ID starting with 410f0bbc23e0e8be37f4332a6320d8d973060e0c7574c9c8885e94522b5de5b7 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.581480 4744 scope.go:117] "RemoveContainer" containerID="e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0" Dec 05 20:16:35 crc kubenswrapper[4744]: E1205 20:16:35.581826 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0\": container with ID starting with e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0 not found: ID does not exist" containerID="e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.581881 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0"} err="failed to get container status \"e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0\": rpc error: code = NotFound desc = could not find container \"e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0\": container with ID starting with e00be5f186decc669c908d316f8433d921596c459b782b0d7f6cc83d4afef2b0 not found: ID does not exist" Dec 05 20:16:35 crc kubenswrapper[4744]: I1205 20:16:35.616444 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7687c74b75-cjptt"] Dec 05 20:16:35 crc kubenswrapper[4744]: W1205 20:16:35.626393 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5dc9d81_3024_44d6_b86a_1fca22004385.slice/crio-441069d821f0a2767441e0f00cf028c82518c9fb947f0119712bf73946d9450e WatchSource:0}: Error finding container 441069d821f0a2767441e0f00cf028c82518c9fb947f0119712bf73946d9450e: Status 404 returned error can't find the container with id 441069d821f0a2767441e0f00cf028c82518c9fb947f0119712bf73946d9450e Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.089989 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db367c1-8f1b-4096-9f23-5a3d14d3980f" path="/var/lib/kubelet/pods/2db367c1-8f1b-4096-9f23-5a3d14d3980f/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.090886 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de19905-69da-4470-a77e-341a5e71412f" path="/var/lib/kubelet/pods/3de19905-69da-4470-a77e-341a5e71412f/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.091348 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649cba80-0f59-449e-8a48-fbb1b4d373e3" path="/var/lib/kubelet/pods/649cba80-0f59-449e-8a48-fbb1b4d373e3/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 
20:16:36.092416 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" path="/var/lib/kubelet/pods/8bfdca92-a782-4806-a2c0-e54302fd24a4/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.092851 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07b8700-0120-4aa2-bd07-8a6f06d8dbf8" path="/var/lib/kubelet/pods/f07b8700-0120-4aa2-bd07-8a6f06d8dbf8/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.094022 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76f1c47-c74d-46cb-ad16-db7392a47a9b" path="/var/lib/kubelet/pods/f76f1c47-c74d-46cb-ad16-db7392a47a9b/volumes" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.178935 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"] Dec 05 20:16:36 crc kubenswrapper[4744]: E1205 20:16:36.179111 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.179122 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.179211 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfdca92-a782-4806-a2c0-e54302fd24a4" containerName="marketplace-operator" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.179879 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.181772 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.196253 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"] Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.280262 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" event={"ID":"d5dc9d81-3024-44d6-b86a-1fca22004385","Type":"ContainerStarted","Data":"f0afdc70b57c18a7db475eff2b236924e662543f60468b0dee38ef20b6ad74ba"} Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.280311 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" event={"ID":"d5dc9d81-3024-44d6-b86a-1fca22004385","Type":"ContainerStarted","Data":"441069d821f0a2767441e0f00cf028c82518c9fb947f0119712bf73946d9450e"} Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.281371 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.284593 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.298968 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7687c74b75-cjptt" podStartSLOduration=3.2989479839999998 podStartE2EDuration="3.298947984s" podCreationTimestamp="2025-12-05 20:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:16:36.297377414 +0000 UTC m=+366.527188802" watchObservedRunningTime="2025-12-05 20:16:36.298947984 +0000 UTC m=+366.528759342" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.354673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.354711 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.354761 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6bkk\" (UniqueName: \"kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.455483 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.455603 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.455637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6bkk\" (UniqueName: \"kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.456053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.456209 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.475850 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-z6bkk\" (UniqueName: \"kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk\") pod \"certified-operators-4qjk5\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") " pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.499038 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.774730 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8x5wd"] Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.775913 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.777971 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.792909 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x5wd"] Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.860528 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-catalog-content\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.860813 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4klv\" (UniqueName: \"kubernetes.io/projected/f7078c58-7b34-4700-83ab-2b104f662fff-kube-api-access-w4klv\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.861051 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-utilities\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.917433 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"] Dec 05 20:16:36 crc kubenswrapper[4744]: W1205 20:16:36.927735 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod082710f4_5dbe_49a3_a13a_1cc99036f530.slice/crio-a008fd53811f986e7fdced29ab3bc47d1801b04e9571475b0c6c9fd2a794ccd4 WatchSource:0}: Error finding container a008fd53811f986e7fdced29ab3bc47d1801b04e9571475b0c6c9fd2a794ccd4: Status 404 returned error can't find the container with id a008fd53811f986e7fdced29ab3bc47d1801b04e9571475b0c6c9fd2a794ccd4 Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.963001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4klv\" (UniqueName: \"kubernetes.io/projected/f7078c58-7b34-4700-83ab-2b104f662fff-kube-api-access-w4klv\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 
05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.963160 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-utilities\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.963204 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-catalog-content\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.963621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-utilities\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.963777 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7078c58-7b34-4700-83ab-2b104f662fff-catalog-content\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:36 crc kubenswrapper[4744]: I1205 20:16:36.987538 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4klv\" (UniqueName: \"kubernetes.io/projected/f7078c58-7b34-4700-83ab-2b104f662fff-kube-api-access-w4klv\") pod \"redhat-marketplace-8x5wd\" (UID: \"f7078c58-7b34-4700-83ab-2b104f662fff\") " pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:37 crc kubenswrapper[4744]: I1205 20:16:37.095497 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:37 crc kubenswrapper[4744]: I1205 20:16:37.298368 4744 generic.go:334] "Generic (PLEG): container finished" podID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerID="52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf" exitCode=0 Dec 05 20:16:37 crc kubenswrapper[4744]: I1205 20:16:37.298461 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerDied","Data":"52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf"} Dec 05 20:16:37 crc kubenswrapper[4744]: I1205 20:16:37.298647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerStarted","Data":"a008fd53811f986e7fdced29ab3bc47d1801b04e9571475b0c6c9fd2a794ccd4"} Dec 05 20:16:37 crc kubenswrapper[4744]: I1205 20:16:37.548757 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x5wd"] Dec 05 20:16:37 crc kubenswrapper[4744]: W1205 20:16:37.559085 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7078c58_7b34_4700_83ab_2b104f662fff.slice/crio-e8be515a5c993559dadcb5059d10f0bd1d226b27928189dac9d5fbfef94d743a WatchSource:0}: Error finding container e8be515a5c993559dadcb5059d10f0bd1d226b27928189dac9d5fbfef94d743a: Status 404 returned error can't find the container with id e8be515a5c993559dadcb5059d10f0bd1d226b27928189dac9d5fbfef94d743a Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.308675 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7078c58-7b34-4700-83ab-2b104f662fff" containerID="00741cd0f747af1d547af50c36154a88ac644b7aec2968a0f296407e3e453d9a" exitCode=0 Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.309373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x5wd" event={"ID":"f7078c58-7b34-4700-83ab-2b104f662fff","Type":"ContainerDied","Data":"00741cd0f747af1d547af50c36154a88ac644b7aec2968a0f296407e3e453d9a"} Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.309689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x5wd" event={"ID":"f7078c58-7b34-4700-83ab-2b104f662fff","Type":"ContainerStarted","Data":"e8be515a5c993559dadcb5059d10f0bd1d226b27928189dac9d5fbfef94d743a"} Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.579025 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mx8q8"] Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.580307 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.582448 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.593153 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx8q8"] Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.684856 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nnww\" (UniqueName: \"kubernetes.io/projected/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-kube-api-access-7nnww\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.684944 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-catalog-content\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.685004 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-utilities\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.786257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nnww\" (UniqueName: \"kubernetes.io/projected/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-kube-api-access-7nnww\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.786367 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-catalog-content\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.786407 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-utilities\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.786903 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-utilities\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.787502 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-catalog-content\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " 
pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.806507 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nnww\" (UniqueName: \"kubernetes.io/projected/b7611a0a-e6bd-4051-bf2d-b3c28e86d91b-kube-api-access-7nnww\") pod \"redhat-operators-mx8q8\" (UID: \"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b\") " pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:38 crc kubenswrapper[4744]: I1205 20:16:38.945130 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.182063 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqwbs"] Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.183962 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.185873 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.186227 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqwbs"] Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.300358 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-utilities\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.300399 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-catalog-content\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.300427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gnj\" (UniqueName: \"kubernetes.io/projected/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-kube-api-access-b9gnj\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.318191 4744 generic.go:334] "Generic (PLEG): container finished" podID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerID="a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716" exitCode=0 Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.318271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerDied","Data":"a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716"} Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.321118 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7078c58-7b34-4700-83ab-2b104f662fff" containerID="ac85c40e6cbb4b4d8b2c3a1f69f88a1a003e3792488b40c1c50b9c00a1cafac4" exitCode=0 Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.321163 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-8x5wd" event={"ID":"f7078c58-7b34-4700-83ab-2b104f662fff","Type":"ContainerDied","Data":"ac85c40e6cbb4b4d8b2c3a1f69f88a1a003e3792488b40c1c50b9c00a1cafac4"} Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.366990 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mx8q8"] Dec 05 20:16:39 crc kubenswrapper[4744]: W1205 20:16:39.375134 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7611a0a_e6bd_4051_bf2d_b3c28e86d91b.slice/crio-7b42b08556efd2d3a008960f4f56bf6c155fbcc506257aa6ed6e931d8de31837 WatchSource:0}: Error finding container 7b42b08556efd2d3a008960f4f56bf6c155fbcc506257aa6ed6e931d8de31837: Status 404 returned error can't find the container with id 7b42b08556efd2d3a008960f4f56bf6c155fbcc506257aa6ed6e931d8de31837 Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.402153 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-utilities\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.402206 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-catalog-content\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.402655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-utilities\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.402782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-catalog-content\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.402835 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gnj\" (UniqueName: \"kubernetes.io/projected/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-kube-api-access-b9gnj\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.423644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gnj\" (UniqueName: \"kubernetes.io/projected/fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293-kube-api-access-b9gnj\") pod \"community-operators-nqwbs\" (UID: \"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293\") " pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:39 crc kubenswrapper[4744]: I1205 20:16:39.518770 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:40 crc kubenswrapper[4744]: I1205 20:16:40.027563 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqwbs"] Dec 05 20:16:40 crc kubenswrapper[4744]: W1205 20:16:40.035945 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd4f5e2f_4f29_4d8a_ab50_a9fd969fe293.slice/crio-87285e9d62d776d2bbc4e12525801a6303eb0ec70b69041cdf2596abae0a6521 WatchSource:0}: Error finding container 87285e9d62d776d2bbc4e12525801a6303eb0ec70b69041cdf2596abae0a6521: Status 404 returned error can't find the container with id 87285e9d62d776d2bbc4e12525801a6303eb0ec70b69041cdf2596abae0a6521 Dec 05 20:16:40 crc kubenswrapper[4744]: I1205 20:16:40.326539 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqwbs" event={"ID":"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293","Type":"ContainerStarted","Data":"87285e9d62d776d2bbc4e12525801a6303eb0ec70b69041cdf2596abae0a6521"} Dec 05 20:16:40 crc kubenswrapper[4744]: I1205 20:16:40.328151 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerStarted","Data":"8181b9eca454425a86d4262a4c948d8714d865c41f24eeeaebad500317d9b33c"} Dec 05 20:16:40 crc kubenswrapper[4744]: I1205 20:16:40.328189 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerStarted","Data":"7b42b08556efd2d3a008960f4f56bf6c155fbcc506257aa6ed6e931d8de31837"} Dec 05 20:16:42 crc kubenswrapper[4744]: I1205 20:16:42.348838 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7611a0a-e6bd-4051-bf2d-b3c28e86d91b" containerID="8181b9eca454425a86d4262a4c948d8714d865c41f24eeeaebad500317d9b33c" exitCode=0 Dec 05 20:16:42 crc kubenswrapper[4744]: I1205 20:16:42.348997 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerDied","Data":"8181b9eca454425a86d4262a4c948d8714d865c41f24eeeaebad500317d9b33c"} Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.371523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x5wd" event={"ID":"f7078c58-7b34-4700-83ab-2b104f662fff","Type":"ContainerStarted","Data":"04ad4c96f1f4940984991bc70597e3c2477b745206648f0fbf655d86ca5640d8"} Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.375370 4744 generic.go:334] "Generic (PLEG): container finished" podID="fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293" containerID="d7f9a790cbc5c270de85f055a2a479ae5c6f9f55b73c825e6ff03ed212c6ff20" exitCode=0 Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.375407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqwbs" event={"ID":"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293","Type":"ContainerDied","Data":"d7f9a790cbc5c270de85f055a2a479ae5c6f9f55b73c825e6ff03ed212c6ff20"} Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.378763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerStarted","Data":"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"} 
Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.400067 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8x5wd" podStartSLOduration=2.975890673 podStartE2EDuration="10.400046718s" podCreationTimestamp="2025-12-05 20:16:36 +0000 UTC" firstStartedPulling="2025-12-05 20:16:38.363694999 +0000 UTC m=+368.593506377" lastFinishedPulling="2025-12-05 20:16:45.787851054 +0000 UTC m=+376.017662422" observedRunningTime="2025-12-05 20:16:46.397538454 +0000 UTC m=+376.627349842" watchObservedRunningTime="2025-12-05 20:16:46.400046718 +0000 UTC m=+376.629858106"
Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.454979 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4qjk5" podStartSLOduration=2.003362831 podStartE2EDuration="10.454964646s" podCreationTimestamp="2025-12-05 20:16:36 +0000 UTC" firstStartedPulling="2025-12-05 20:16:37.299884824 +0000 UTC m=+367.529696192" lastFinishedPulling="2025-12-05 20:16:45.751486629 +0000 UTC m=+375.981298007" observedRunningTime="2025-12-05 20:16:46.435409988 +0000 UTC m=+376.665221366" watchObservedRunningTime="2025-12-05 20:16:46.454964646 +0000 UTC m=+376.684776014"
Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.499843 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4qjk5"
Dec 05 20:16:46 crc kubenswrapper[4744]: I1205 20:16:46.499891 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4qjk5"
Dec 05 20:16:47 crc kubenswrapper[4744]: I1205 20:16:47.096163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8x5wd"
Dec 05 20:16:47 crc kubenswrapper[4744]: I1205 20:16:47.096212 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8x5wd"
Dec 05 20:16:47 crc kubenswrapper[4744]: I1205 20:16:47.480637 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" podUID="98e5f65e-632c-4932-83cc-413ea5cac23a" containerName="registry" containerID="cri-o://bf768afeefae123d6258722c487cd5f64a0e76c00ad25d9ced41b23a3071cb8a" gracePeriod=30
Dec 05 20:16:47 crc kubenswrapper[4744]: I1205 20:16:47.535965 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4qjk5" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="registry-server" probeResult="failure" output=<
Dec 05 20:16:47 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Dec 05 20:16:47 crc kubenswrapper[4744]: >
Dec 05 20:16:48 crc kubenswrapper[4744]: I1205 20:16:48.181646 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8x5wd" podUID="f7078c58-7b34-4700-83ab-2b104f662fff" containerName="registry-server" probeResult="failure" output=<
Dec 05 20:16:48 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Dec 05 20:16:48 crc kubenswrapper[4744]: >
Dec 05 20:16:48 crc kubenswrapper[4744]: I1205 20:16:48.743511 4744 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-qv6mb container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 20:16:48 crc kubenswrapper[4744]: I1205 20:16:48.743599 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-qv6mb" podUID="2fd9747b-ba54-4fa6-8849-7447d6683c68" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 20:16:49 crc kubenswrapper[4744]: I1205 20:16:49.807132 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:16:49 crc kubenswrapper[4744]: I1205 20:16:49.807660 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:16:50 crc kubenswrapper[4744]: I1205 20:16:50.405761 4744 generic.go:334] "Generic (PLEG): container finished" podID="98e5f65e-632c-4932-83cc-413ea5cac23a" containerID="bf768afeefae123d6258722c487cd5f64a0e76c00ad25d9ced41b23a3071cb8a" exitCode=0
Dec 05 20:16:50 crc kubenswrapper[4744]: I1205 20:16:50.405811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" event={"ID":"98e5f65e-632c-4932-83cc-413ea5cac23a","Type":"ContainerDied","Data":"bf768afeefae123d6258722c487cd5f64a0e76c00ad25d9ced41b23a3071cb8a"}
Dec 05 20:16:52 crc kubenswrapper[4744]: I1205 20:16:52.424381 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqwbs" event={"ID":"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293","Type":"ContainerStarted","Data":"d62d63575a539baac72372334e04ccffa16cab2479d23cb6d880b345f905f75c"}
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.275340 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.363813 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.363873 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.363901 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.363952 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.363991 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.364012 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.364177 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.364215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzn7\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7\") pod \"98e5f65e-632c-4932-83cc-413ea5cac23a\" (UID: \"98e5f65e-632c-4932-83cc-413ea5cac23a\") "
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.366494 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.367906 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.375046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.375109 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.378075 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.386445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.388807 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.390285 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7" (OuterVolumeSpecName: "kube-api-access-hxzn7") pod "98e5f65e-632c-4932-83cc-413ea5cac23a" (UID: "98e5f65e-632c-4932-83cc-413ea5cac23a"). InnerVolumeSpecName "kube-api-access-hxzn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.435829 4744 generic.go:334] "Generic (PLEG): container finished" podID="fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293" containerID="d62d63575a539baac72372334e04ccffa16cab2479d23cb6d880b345f905f75c" exitCode=0
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.435900 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqwbs" event={"ID":"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293","Type":"ContainerDied","Data":"d62d63575a539baac72372334e04ccffa16cab2479d23cb6d880b345f905f75c"}
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.437235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-628ml" event={"ID":"98e5f65e-632c-4932-83cc-413ea5cac23a","Type":"ContainerDied","Data":"340cb9599d2a21e3c856eac949c4ac0763252c4ec8a3dbf1588903c194cdda26"}
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.437269 4744 scope.go:117] "RemoveContainer" containerID="bf768afeefae123d6258722c487cd5f64a0e76c00ad25d9ced41b23a3071cb8a"
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.437365 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-628ml"
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.446098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerStarted","Data":"5705253799aabc4bb1e04ed7265b9d423857c1f89842b684330eaf6fe9f2e238"}
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466251 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466280 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466303 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzn7\" (UniqueName: \"kubernetes.io/projected/98e5f65e-632c-4932-83cc-413ea5cac23a-kube-api-access-hxzn7\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466313 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466322 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98e5f65e-632c-4932-83cc-413ea5cac23a-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466330 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98e5f65e-632c-4932-83cc-413ea5cac23a-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.466339 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98e5f65e-632c-4932-83cc-413ea5cac23a-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.491705 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"]
Dec 05 20:16:54 crc kubenswrapper[4744]: I1205 20:16:54.495431 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-628ml"]
Dec 05 20:16:55 crc kubenswrapper[4744]: I1205 20:16:55.457167 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7611a0a-e6bd-4051-bf2d-b3c28e86d91b" containerID="5705253799aabc4bb1e04ed7265b9d423857c1f89842b684330eaf6fe9f2e238" exitCode=0
Dec 05 20:16:55 crc kubenswrapper[4744]: I1205 20:16:55.457231 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerDied","Data":"5705253799aabc4bb1e04ed7265b9d423857c1f89842b684330eaf6fe9f2e238"}
Dec 05 20:16:55 crc kubenswrapper[4744]: I1205 20:16:55.482592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqwbs" event={"ID":"fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293","Type":"ContainerStarted","Data":"41753d6820a2ba6039149dc7697cea46534c95eb733f5f2b79061befe31d6af6"}
Dec 05 20:16:55 crc kubenswrapper[4744]: I1205 20:16:55.511888 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqwbs" podStartSLOduration=7.920965719 podStartE2EDuration="16.511866605s" podCreationTimestamp="2025-12-05 20:16:39 +0000 UTC" firstStartedPulling="2025-12-05 20:16:46.377270446 +0000 UTC m=+376.607081804" lastFinishedPulling="2025-12-05 20:16:54.968171322 +0000 UTC m=+385.197982690" observedRunningTime="2025-12-05 20:16:55.506897117 +0000 UTC m=+385.736708515" watchObservedRunningTime="2025-12-05 20:16:55.511866605 +0000 UTC m=+385.741677983"
Dec 05 20:16:56 crc kubenswrapper[4744]: I1205 20:16:56.087239 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e5f65e-632c-4932-83cc-413ea5cac23a" path="/var/lib/kubelet/pods/98e5f65e-632c-4932-83cc-413ea5cac23a/volumes"
Dec 05 20:16:56 crc kubenswrapper[4744]: I1205 20:16:56.493175 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mx8q8" event={"ID":"b7611a0a-e6bd-4051-bf2d-b3c28e86d91b","Type":"ContainerStarted","Data":"bba168a69ecee061fc57339674cc06b8c4817e61b11dc82004f15845375f6455"}
Dec 05 20:16:56 crc kubenswrapper[4744]: I1205 20:16:56.522280 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mx8q8" podStartSLOduration=9.054576183 podStartE2EDuration="18.522265081s" podCreationTimestamp="2025-12-05 20:16:38 +0000 UTC" firstStartedPulling="2025-12-05 20:16:46.38011845 +0000 UTC m=+376.609929818" lastFinishedPulling="2025-12-05 20:16:55.847807348 +0000 UTC m=+386.077618716" observedRunningTime="2025-12-05 20:16:56.518924805 +0000 UTC m=+386.748736183" watchObservedRunningTime="2025-12-05 20:16:56.522265081 +0000 UTC m=+386.752076459"
Dec 05 20:16:56 crc kubenswrapper[4744]: I1205 20:16:56.553145 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4qjk5"
Dec 05 20:16:56 crc kubenswrapper[4744]: I1205 20:16:56.599109 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4qjk5"
pod="openshift-marketplace/certified-operators-4qjk5" Dec 05 20:16:57 crc kubenswrapper[4744]: I1205 20:16:57.170381 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:57 crc kubenswrapper[4744]: I1205 20:16:57.219119 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8x5wd" Dec 05 20:16:58 crc kubenswrapper[4744]: I1205 20:16:58.945765 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:58 crc kubenswrapper[4744]: I1205 20:16:58.946183 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:16:59 crc kubenswrapper[4744]: I1205 20:16:59.519234 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:59 crc kubenswrapper[4744]: I1205 20:16:59.519328 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:59 crc kubenswrapper[4744]: I1205 20:16:59.580938 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:16:59 crc kubenswrapper[4744]: I1205 20:16:59.990387 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mx8q8" podUID="b7611a0a-e6bd-4051-bf2d-b3c28e86d91b" containerName="registry-server" probeResult="failure" output=< Dec 05 20:16:59 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Dec 05 20:16:59 crc kubenswrapper[4744]: > Dec 05 20:17:00 crc kubenswrapper[4744]: I1205 20:17:00.561921 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqwbs" Dec 05 20:17:09 crc kubenswrapper[4744]: I1205 20:17:09.012110 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:17:09 crc kubenswrapper[4744]: I1205 20:17:09.082883 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mx8q8" Dec 05 20:17:19 crc kubenswrapper[4744]: I1205 20:17:19.807447 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:17:19 crc kubenswrapper[4744]: I1205 20:17:19.808253 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:17:49 crc kubenswrapper[4744]: I1205 20:17:49.806881 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:17:49 crc kubenswrapper[4744]: I1205 20:17:49.807586 4744 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:17:49 crc kubenswrapper[4744]: I1205 20:17:49.807667 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:17:49 crc kubenswrapper[4744]: I1205 20:17:49.808698 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:17:49 crc kubenswrapper[4744]: I1205 20:17:49.808805 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f" gracePeriod=600 Dec 05 20:17:50 crc kubenswrapper[4744]: I1205 20:17:50.865473 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f" exitCode=0 Dec 05 20:17:50 crc kubenswrapper[4744]: I1205 20:17:50.865612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f"} Dec 05 20:17:50 crc kubenswrapper[4744]: I1205 20:17:50.865922 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda"} Dec 05 20:17:50 crc kubenswrapper[4744]: I1205 20:17:50.865965 4744 scope.go:117] "RemoveContainer" containerID="121f7d712e41d02f802536a5b9eefd55ddfad47b064890bf83ec7191b6ea908b" Dec 05 20:20:19 crc kubenswrapper[4744]: I1205 20:20:19.807225 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:20:19 crc kubenswrapper[4744]: I1205 20:20:19.807943 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:20:49 crc kubenswrapper[4744]: I1205 20:20:49.806627 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 
Dec 05 20:20:49 crc kubenswrapper[4744]: I1205 20:20:49.807154 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:21:19 crc kubenswrapper[4744]: I1205 20:21:19.807157 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:21:19 crc kubenswrapper[4744]: I1205 20:21:19.807936 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:21:19 crc kubenswrapper[4744]: I1205 20:21:19.808014 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd"
Dec 05 20:21:19 crc kubenswrapper[4744]: I1205 20:21:19.808897 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:21:19 crc kubenswrapper[4744]: I1205 20:21:19.809005 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda" gracePeriod=600
Dec 05 20:21:20 crc kubenswrapper[4744]: I1205 20:21:20.304374 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda" exitCode=0
Dec 05 20:21:20 crc kubenswrapper[4744]: I1205 20:21:20.304600 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda"}
Dec 05 20:21:20 crc kubenswrapper[4744]: I1205 20:21:20.304687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942"}
Dec 05 20:21:20 crc kubenswrapper[4744]: I1205 20:21:20.304716 4744 scope.go:117] "RemoveContainer" containerID="d9a82750b0c52c0406985c221fbcd15515963a387f45fb587ee836a849ecce2f"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.363106 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"]
Dec 05 20:22:25 crc kubenswrapper[4744]: E1205 20:22:25.364139 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e5f65e-632c-4932-83cc-413ea5cac23a" containerName="registry"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.364163 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e5f65e-632c-4932-83cc-413ea5cac23a" containerName="registry"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.364391 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e5f65e-632c-4932-83cc-413ea5cac23a" containerName="registry"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.365823 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.368111 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.377936 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"]
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.462427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.462491 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.462531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5t8\" (UniqueName: \"kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.563442 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.563521 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.563565 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5t8\" (UniqueName: \"kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.564106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.564217 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.598754 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5t8\" (UniqueName: \"kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:25 crc kubenswrapper[4744]: I1205 20:22:25.683639 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:26 crc kubenswrapper[4744]: I1205 20:22:26.091158 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"]
Dec 05 20:22:26 crc kubenswrapper[4744]: I1205 20:22:26.729372 4744 generic.go:334] "Generic (PLEG): container finished" podID="64773703-6ddb-4194-b745-6d130565fe68" containerID="e7a8e316843d964e5ba93c8fa1dc614155d59db17be3b618ef0150c213e13b40" exitCode=0
Dec 05 20:22:26 crc kubenswrapper[4744]: I1205 20:22:26.729621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerDied","Data":"e7a8e316843d964e5ba93c8fa1dc614155d59db17be3b618ef0150c213e13b40"}
Dec 05 20:22:26 crc kubenswrapper[4744]: I1205 20:22:26.729653 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerStarted","Data":"4bf08e22289927866068087fbc4e70bae1ba19de625b318bc309b2a76060a9c5"}
Dec 05 20:22:26 crc kubenswrapper[4744]: I1205 20:22:26.731532 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 20:22:29 crc kubenswrapper[4744]: I1205 20:22:29.747462 4744 generic.go:334] "Generic (PLEG): container finished" podID="64773703-6ddb-4194-b745-6d130565fe68" containerID="116809ac02951df3496eadc41bcbe3f4e5ed92c7ee3db69d6dfebec7f114a489" exitCode=0
Dec 05 20:22:29 crc kubenswrapper[4744]: I1205 20:22:29.747537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerDied","Data":"116809ac02951df3496eadc41bcbe3f4e5ed92c7ee3db69d6dfebec7f114a489"}
Dec 05 20:22:30 crc kubenswrapper[4744]: I1205 20:22:30.384971 4744 scope.go:117] "RemoveContainer" containerID="d3303cf6c938ba77cfa59df7fd22ef243b9ec76d38a7f2d989dbf604276b55b5"
Dec 05 20:22:31 crc kubenswrapper[4744]: I1205 20:22:31.775943 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerStarted","Data":"4739f55ec5cbde49c29e98dc83760f39a6f81560d384ed9f5b36cfecab97442b"}
Dec 05 20:22:31 crc kubenswrapper[4744]: I1205 20:22:31.806148 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" podStartSLOduration=4.638266972 podStartE2EDuration="6.806114988s" podCreationTimestamp="2025-12-05 20:22:25 +0000 UTC" firstStartedPulling="2025-12-05 20:22:26.731053486 +0000 UTC m=+716.960864874" lastFinishedPulling="2025-12-05 20:22:28.898901522 +0000 UTC m=+719.128712890" observedRunningTime="2025-12-05 20:22:31.799776547 +0000 UTC m=+722.029587935" watchObservedRunningTime="2025-12-05 20:22:31.806114988 +0000 UTC m=+722.035926396"
Dec 05 20:22:32 crc kubenswrapper[4744]: I1205 20:22:32.782974 4744 generic.go:334] "Generic (PLEG): container finished" podID="64773703-6ddb-4194-b745-6d130565fe68" containerID="4739f55ec5cbde49c29e98dc83760f39a6f81560d384ed9f5b36cfecab97442b" exitCode=0
Dec 05 20:22:32 crc kubenswrapper[4744]: I1205 20:22:32.783136 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerDied","Data":"4739f55ec5cbde49c29e98dc83760f39a6f81560d384ed9f5b36cfecab97442b"}
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.055410 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.171898 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn5t8\" (UniqueName: \"kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8\") pod \"64773703-6ddb-4194-b745-6d130565fe68\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") "
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.171997 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle\") pod \"64773703-6ddb-4194-b745-6d130565fe68\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") "
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.172097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util\") pod \"64773703-6ddb-4194-b745-6d130565fe68\" (UID: \"64773703-6ddb-4194-b745-6d130565fe68\") "
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.173859 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle" (OuterVolumeSpecName: "bundle") pod "64773703-6ddb-4194-b745-6d130565fe68" (UID: "64773703-6ddb-4194-b745-6d130565fe68"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.179934 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8" (OuterVolumeSpecName: "kube-api-access-hn5t8") pod "64773703-6ddb-4194-b745-6d130565fe68" (UID: "64773703-6ddb-4194-b745-6d130565fe68"). InnerVolumeSpecName "kube-api-access-hn5t8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.196120 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util" (OuterVolumeSpecName: "util") pod "64773703-6ddb-4194-b745-6d130565fe68" (UID: "64773703-6ddb-4194-b745-6d130565fe68"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.274406 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn5t8\" (UniqueName: \"kubernetes.io/projected/64773703-6ddb-4194-b745-6d130565fe68-kube-api-access-hn5t8\") on node \"crc\" DevicePath \"\""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.274468 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.274487 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64773703-6ddb-4194-b745-6d130565fe68-util\") on node \"crc\" DevicePath \"\""
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.822138 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk" event={"ID":"64773703-6ddb-4194-b745-6d130565fe68","Type":"ContainerDied","Data":"4bf08e22289927866068087fbc4e70bae1ba19de625b318bc309b2a76060a9c5"}
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.822184 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf08e22289927866068087fbc4e70bae1ba19de625b318bc309b2a76060a9c5"
Dec 05 20:22:34 crc kubenswrapper[4744]: I1205 20:22:34.822202 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk"
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.444809 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bk4n"]
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445250 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-controller" containerID="cri-o://fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b" gracePeriod=30
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445325 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="northd" containerID="cri-o://bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969" gracePeriod=30
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445399 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b" gracePeriod=30
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445419 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="sbdb" containerID="cri-o://efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915" gracePeriod=30
Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445450 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-node" containerID="cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015" gracePeriod=30
containerID="cri-o://6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015" gracePeriod=30 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445485 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="nbdb" containerID="cri-o://9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7" gracePeriod=30 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.445489 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-acl-logging" containerID="cri-o://97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb" gracePeriod=30 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.504183 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" containerID="cri-o://82823500d1248bb0c059dbb22c93d962b48fbe35255bd4337304866b2a19b887" gracePeriod=30 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.832773 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/2.log" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.833277 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/1.log" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.833356 4744 generic.go:334] "Generic (PLEG): container finished" podID="89bdeba9-f644-4465-a9f8-82c682f6aea3" containerID="158a06cf97c1029c61e484aea0506a8356678a2eb865af54482cad3a1605bc60" exitCode=2 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.833395 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerDied","Data":"158a06cf97c1029c61e484aea0506a8356678a2eb865af54482cad3a1605bc60"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.833445 4744 scope.go:117] "RemoveContainer" containerID="6ab97d51a3279ce570cf3560d86cc5052f5e9bbd25e84afcca05bcce623fc34c" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.833902 4744 scope.go:117] "RemoveContainer" containerID="158a06cf97c1029c61e484aea0506a8356678a2eb865af54482cad3a1605bc60" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.836760 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovnkube-controller/3.log" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.858327 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-acl-logging/0.log" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859004 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-controller/0.log" Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859612 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="82823500d1248bb0c059dbb22c93d962b48fbe35255bd4337304866b2a19b887" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859633 
4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859641 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859647 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859656 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859663 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015" exitCode=0 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859670 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb" exitCode=143 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859676 4744 generic.go:334] "Generic (PLEG): container finished" podID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerID="fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b" exitCode=143 Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"82823500d1248bb0c059dbb22c93d962b48fbe35255bd4337304866b2a19b887"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859736 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859752 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859765 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859792 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" 
event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859806 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.859819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b"} Dec 05 20:22:36 crc kubenswrapper[4744]: I1205 20:22:36.866565 4744 scope.go:117] "RemoveContainer" containerID="c76f057fbbd159859c2e61f4c1c474d846b5e243375b35940fc12a4735d8b5e9" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.116900 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-acl-logging/0.log" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.127262 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-controller/0.log" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.130216 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193445 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m5crs"] Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193674 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="nbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193693 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="nbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193709 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193718 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193728 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="util" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193737 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="util" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193746 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kubecfg-setup" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193755 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kubecfg-setup" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193769 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" 
containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193779 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193788 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193796 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193807 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="sbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193814 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="sbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193823 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193831 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193840 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193848 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193861 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="extract" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193868 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="extract" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193880 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="pull" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193887 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="pull" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193898 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-node" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193905 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-node" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193914 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193923 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193934 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" 
containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193942 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193954 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-acl-logging" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193961 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-acl-logging" Dec 05 20:22:37 crc kubenswrapper[4744]: E1205 20:22:37.193971 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="northd" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.193978 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="northd" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194089 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="sbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194099 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194108 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-acl-logging" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194123 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194131 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="64773703-6ddb-4194-b745-6d130565fe68" containerName="extract" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194139 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194147 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="northd" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194156 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194166 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="kube-rbac-proxy-node" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194174 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovn-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194183 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="nbdb" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194438 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.194647 4744 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" containerName="ovnkube-controller" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.196387 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212333 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212378 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212416 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212453 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212481 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212502 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212506 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212525 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212569 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212557 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212589 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212584 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212616 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket" (OuterVolumeSpecName: "log-socket") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212608 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212607 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash" (OuterVolumeSpecName: "host-slash") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212640 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212677 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212710 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212749 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212764 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212792 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212822 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212853 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212889 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97hdl\" (UniqueName: \"kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl\") pod \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\" (UID: \"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6\") " Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212810 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212831 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log" (OuterVolumeSpecName: "node-log") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212852 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212871 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212890 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.212968 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213279 4744 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213315 4744 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213328 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213339 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213350 4744 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213361 4744 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213371 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213381 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213391 4744 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213402 4744 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213413 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213423 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213434 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213444 4744 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213454 4744 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213464 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.213474 4744 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.220877 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl" (OuterVolumeSpecName: "kube-api-access-97hdl") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "kube-api-access-97hdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.225925 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.228878 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" (UID: "99bea8e6-6eff-4db0-8e98-20a5ae64e0d6"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-systemd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314446 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-var-lib-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314470 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-ovn\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314492 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-config\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314578 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-slash\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-script-lib\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314665 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovn-node-metrics-cert\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fvx\" (UniqueName: \"kubernetes.io/projected/5d75c040-89a0-4ce7-8991-49f58f7dd168-kube-api-access-j8fvx\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314712 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-systemd-units\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314731 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-log-socket\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314772 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-env-overrides\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314843 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-node-log\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314867 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-netns\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-etc-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314922 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314955 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-kubelet\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.314998 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-bin\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.315019 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-netd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.315165 4744 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.315194 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.315208 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97hdl\" (UniqueName: \"kubernetes.io/projected/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6-kube-api-access-97hdl\") on node \"crc\" DevicePath \"\"" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415750 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-bin\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-netd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415813 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-systemd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415827 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-var-lib-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415846 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-ovn\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-config\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-slash\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-netd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-script-lib\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-cni-bin\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-var-lib-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.415972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-systemd\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416030 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-slash\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovn-node-metrics-cert\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416170 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-ovn\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416196 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fvx\" (UniqueName: \"kubernetes.io/projected/5d75c040-89a0-4ce7-8991-49f58f7dd168-kube-api-access-j8fvx\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416231 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-systemd-units\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416268 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416282 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-log-socket\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416329 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-run-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416335 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416347 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-env-overrides\") pod 
\"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416367 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-log-socket\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416281 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-systemd-units\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416403 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-node-log\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-netns\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416462 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-etc-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-node-log\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416488 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416515 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-kubelet\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416519 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-run-netns\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416577 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416582 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-host-kubelet\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416604 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5d75c040-89a0-4ce7-8991-49f58f7dd168-etc-openvswitch\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416798 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-config\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416828 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-env-overrides\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.416910 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovnkube-script-lib\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.420618 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d75c040-89a0-4ce7-8991-49f58f7dd168-ovn-node-metrics-cert\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.454996 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fvx\" (UniqueName: \"kubernetes.io/projected/5d75c040-89a0-4ce7-8991-49f58f7dd168-kube-api-access-j8fvx\") pod \"ovnkube-node-m5crs\" (UID: \"5d75c040-89a0-4ce7-8991-49f58f7dd168\") " pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.508517 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:37 crc kubenswrapper[4744]: W1205 20:22:37.528620 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d75c040_89a0_4ce7_8991_49f58f7dd168.slice/crio-148981fe6cc36713f9fb9fead8806cd888e59e90a4ee20c6dd45aea9237aa0e9 WatchSource:0}: Error finding container 148981fe6cc36713f9fb9fead8806cd888e59e90a4ee20c6dd45aea9237aa0e9: Status 404 returned error can't find the container with id 148981fe6cc36713f9fb9fead8806cd888e59e90a4ee20c6dd45aea9237aa0e9
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.868418 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-acl-logging/0.log"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.869101 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bk4n_99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/ovn-controller/0.log"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.869516 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n" event={"ID":"99bea8e6-6eff-4db0-8e98-20a5ae64e0d6","Type":"ContainerDied","Data":"93d028c9806d6ee200f9c1442800c265097fab978e5e2daad308c1acffa58359"}
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.869581 4744 scope.go:117] "RemoveContainer" containerID="82823500d1248bb0c059dbb22c93d962b48fbe35255bd4337304866b2a19b887"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.869532 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bk4n"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.871548 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qlm7_89bdeba9-f644-4465-a9f8-82c682f6aea3/kube-multus/2.log"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.871633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qlm7" event={"ID":"89bdeba9-f644-4465-a9f8-82c682f6aea3","Type":"ContainerStarted","Data":"19ff0f249f2542dbade4407fd38ffd7a9213599d65b0f8378d8a392f82316479"}
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.873078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"148981fe6cc36713f9fb9fead8806cd888e59e90a4ee20c6dd45aea9237aa0e9"}
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.887468 4744 scope.go:117] "RemoveContainer" containerID="efe396b2630a056484e95a4c6ac31f6ab3ed0a02885638d333fec828bf9d8915"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.902437 4744 scope.go:117] "RemoveContainer" containerID="9e8c9b2aba95f738f03f4c51f2770e001e2ccc8ba879ebd20ed2a804e133b2e7"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.919572 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bk4n"]
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.929073 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bk4n"]
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.929242 4744 scope.go:117] "RemoveContainer" containerID="bc3d371181246a09d9cb7e4b1c4d943d293d324db7817e6602f3a2714dc1a969"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.948849 4744 scope.go:117] "RemoveContainer" containerID="0ca47ffde15d8f48fd974f421ca950f41fbcf68843ea26d819f629adc2645d3b"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.961950 4744 scope.go:117] "RemoveContainer" containerID="6285a15b8a27d54f6b4f49cb874f4c563b5eeec5a3bf79471af90cdfdc191015"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.976208 4744 scope.go:117] "RemoveContainer" containerID="97de484119eed7dea902eb2896b3f86e28b2e2cf600b5d1af8388bcb9fcea1cb"
Dec 05 20:22:37 crc kubenswrapper[4744]: I1205 20:22:37.993358 4744 scope.go:117] "RemoveContainer" containerID="fa93d6f01d966609ba63ee6852bd89aa1c4d24988cd26e5da767d20ddc47c45b"
Dec 05 20:22:38 crc kubenswrapper[4744]: I1205 20:22:38.008947 4744 scope.go:117] "RemoveContainer" containerID="a4d49fb1e297c30bf979a7048c4ea5e0ac7a5643d984b8b59ba2e13eb169771a"
Dec 05 20:22:38 crc kubenswrapper[4744]: I1205 20:22:38.096896 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bea8e6-6eff-4db0-8e98-20a5ae64e0d6" path="/var/lib/kubelet/pods/99bea8e6-6eff-4db0-8e98-20a5ae64e0d6/volumes"
Dec 05 20:22:38 crc kubenswrapper[4744]: I1205 20:22:38.879177 4744 generic.go:334] "Generic (PLEG): container finished" podID="5d75c040-89a0-4ce7-8991-49f58f7dd168" containerID="c6e93510a073781a9f741628ca36c9f393f85e2a888eeb85dc43480dbcad5b11" exitCode=0
Dec 05 20:22:38 crc kubenswrapper[4744]: I1205 20:22:38.879262 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerDied","Data":"c6e93510a073781a9f741628ca36c9f393f85e2a888eeb85dc43480dbcad5b11"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887082 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"3f471a35de9d8e612d9f0230a90ba1689ffca7eb87d8d06efe7cd60474af9649"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887424 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"301983f7e7fc1de6523db0705c9acf0ab8c4f9173df667a641bef7e9f30a0c36"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887437 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"2216fc6c3a00b83d83e1bdcc1f70dd514ba424d0f6bbf21d647a97a1031def7b"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"fae60521c363b2878827bca8ff6505adcb4c49f029573280e58332575a826b24"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887457 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"b02bc2c37c0807e1637d7c9d074eabab437425166f0e0fb736bb74e704ad3211"}
Dec 05 20:22:39 crc kubenswrapper[4744]: I1205 20:22:39.887466 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"0af82f5db8c25c51bc77cadcfb5276dd1ea06669995ff1ab582176260dc5f0f0"}
Dec 05 20:22:42 crc kubenswrapper[4744]: I1205 20:22:42.917805 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"9ddc00bf108e104bff31ad8e9e7e4f3292ae763cf137b72376dd41317db89ee2"}
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.090151 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"]
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.090878 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.092735 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.093834 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.094339 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-85k7s"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.178348 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6gkn\" (UniqueName: \"kubernetes.io/projected/d6cb3e32-cc6f-4091-ae30-5de5790d952c-kube-api-access-r6gkn\") pod \"obo-prometheus-operator-668cf9dfbb-6vnbf\" (UID: \"d6cb3e32-cc6f-4091-ae30-5de5790d952c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.208045 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"]
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.208700 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.210396 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-7f6ww"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.210536 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.222646 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"]
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.223373 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.279076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6gkn\" (UniqueName: \"kubernetes.io/projected/d6cb3e32-cc6f-4091-ae30-5de5790d952c-kube-api-access-r6gkn\") pod \"obo-prometheus-operator-668cf9dfbb-6vnbf\" (UID: \"d6cb3e32-cc6f-4091-ae30-5de5790d952c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.279136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.279188 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.311817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6gkn\" (UniqueName: \"kubernetes.io/projected/d6cb3e32-cc6f-4091-ae30-5de5790d952c-kube-api-access-r6gkn\") pod \"obo-prometheus-operator-668cf9dfbb-6vnbf\" (UID: \"d6cb3e32-cc6f-4091-ae30-5de5790d952c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.380518 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.380607 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.380668 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.380708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.384199 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.384453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a755ef85-a445-4cd1-bb7b-4bea0bb7b796-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n\" (UID: \"a755ef85-a445-4cd1-bb7b-4bea0bb7b796\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.409902 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.445919 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(0c5035a7bacd6952d13482327f1e61b5f8a2d36a741d1235ad07fe38d4dbdaaa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.446115 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(0c5035a7bacd6952d13482327f1e61b5f8a2d36a741d1235ad07fe38d4dbdaaa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.446208 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(0c5035a7bacd6952d13482327f1e61b5f8a2d36a741d1235ad07fe38d4dbdaaa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.446349 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators(d6cb3e32-cc6f-4091-ae30-5de5790d952c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators(d6cb3e32-cc6f-4091-ae30-5de5790d952c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(0c5035a7bacd6952d13482327f1e61b5f8a2d36a741d1235ad07fe38d4dbdaaa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf" podUID="d6cb3e32-cc6f-4091-ae30-5de5790d952c"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.482337 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.482981 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.487044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.487802 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73c8ea4a-800d-4d94-9732-f81484c43481-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8\" (UID: \"73c8ea4a-800d-4d94-9732-f81484c43481\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.503831 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-nlfb9"]
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.504652 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.509428 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-7llkg"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.510018 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.520534 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.535207 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.541265 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(c4b24275ee288164c6e400984bb0aa0705916f6bccd22632eed520d0583c4dd9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.541340 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(c4b24275ee288164c6e400984bb0aa0705916f6bccd22632eed520d0583c4dd9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.541363 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(c4b24275ee288164c6e400984bb0aa0705916f6bccd22632eed520d0583c4dd9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.541409 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators(a755ef85-a445-4cd1-bb7b-4bea0bb7b796)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators(a755ef85-a445-4cd1-bb7b-4bea0bb7b796)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(c4b24275ee288164c6e400984bb0aa0705916f6bccd22632eed520d0583c4dd9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n" podUID="a755ef85-a445-4cd1-bb7b-4bea0bb7b796"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.583231 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(a05e4aedfe9c9d32e040e2fdd27279ffa17a0b8cc97be156bd6da3904719d331): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.583315 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(a05e4aedfe9c9d32e040e2fdd27279ffa17a0b8cc97be156bd6da3904719d331): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.583338 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(a05e4aedfe9c9d32e040e2fdd27279ffa17a0b8cc97be156bd6da3904719d331): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.583389 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators(73c8ea4a-800d-4d94-9732-f81484c43481)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators(73c8ea4a-800d-4d94-9732-f81484c43481)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(a05e4aedfe9c9d32e040e2fdd27279ffa17a0b8cc97be156bd6da3904719d331): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8" podUID="73c8ea4a-800d-4d94-9732-f81484c43481"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.583708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47lhg\" (UniqueName: \"kubernetes.io/projected/f152504a-5f82-434d-904e-b9e1f2e49a5e-kube-api-access-47lhg\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.583763 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f152504a-5f82-434d-904e-b9e1f2e49a5e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.684747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47lhg\" (UniqueName: \"kubernetes.io/projected/f152504a-5f82-434d-904e-b9e1f2e49a5e-kube-api-access-47lhg\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.684853 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f152504a-5f82-434d-904e-b9e1f2e49a5e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.690723 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f152504a-5f82-434d-904e-b9e1f2e49a5e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.710025 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47lhg\" (UniqueName: \"kubernetes.io/projected/f152504a-5f82-434d-904e-b9e1f2e49a5e-kube-api-access-47lhg\") pod \"observability-operator-d8bb48f5d-nlfb9\" (UID: \"f152504a-5f82-434d-904e-b9e1f2e49a5e\") " pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.765900 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-52fcv"]
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.766556 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.768106 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-94qb4"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.820721 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.843277 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(52fb58dd8180007b26d53821f5448b8a9455bb0f104cefffc6342f4cbac84986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.843368 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(52fb58dd8180007b26d53821f5448b8a9455bb0f104cefffc6342f4cbac84986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.843400 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(52fb58dd8180007b26d53821f5448b8a9455bb0f104cefffc6342f4cbac84986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:43 crc kubenswrapper[4744]: E1205 20:22:43.843462 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-nlfb9_openshift-operators(f152504a-5f82-434d-904e-b9e1f2e49a5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-nlfb9_openshift-operators(f152504a-5f82-434d-904e-b9e1f2e49a5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(52fb58dd8180007b26d53821f5448b8a9455bb0f104cefffc6342f4cbac84986): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9" podUID="f152504a-5f82-434d-904e-b9e1f2e49a5e"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.887937 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4f5a8a-a57f-4988-8b36-f8926084fce9-openshift-service-ca\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.887979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw24z\" (UniqueName: \"kubernetes.io/projected/db4f5a8a-a57f-4988-8b36-f8926084fce9-kube-api-access-mw24z\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.988831 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4f5a8a-a57f-4988-8b36-f8926084fce9-openshift-service-ca\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.988873 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw24z\" (UniqueName: \"kubernetes.io/projected/db4f5a8a-a57f-4988-8b36-f8926084fce9-kube-api-access-mw24z\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:43 crc kubenswrapper[4744]: I1205 20:22:43.989709 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/db4f5a8a-a57f-4988-8b36-f8926084fce9-openshift-service-ca\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:44 crc kubenswrapper[4744]: I1205 20:22:44.004879 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw24z\" (UniqueName: \"kubernetes.io/projected/db4f5a8a-a57f-4988-8b36-f8926084fce9-kube-api-access-mw24z\") pod \"perses-operator-5446b9c989-52fcv\" (UID: \"db4f5a8a-a57f-4988-8b36-f8926084fce9\") " pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:44 crc kubenswrapper[4744]: I1205 20:22:44.080664 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:44 crc kubenswrapper[4744]: E1205 20:22:44.110552 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(4bd5d0f0ec2359d425924c0c6cb9a5e350f0985778bbdc444b2f459530ce2fc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:44 crc kubenswrapper[4744]: E1205 20:22:44.110611 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(4bd5d0f0ec2359d425924c0c6cb9a5e350f0985778bbdc444b2f459530ce2fc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:44 crc kubenswrapper[4744]: E1205 20:22:44.110635 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(4bd5d0f0ec2359d425924c0c6cb9a5e350f0985778bbdc444b2f459530ce2fc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:44 crc kubenswrapper[4744]: E1205 20:22:44.110680 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-52fcv_openshift-operators(db4f5a8a-a57f-4988-8b36-f8926084fce9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-52fcv_openshift-operators(db4f5a8a-a57f-4988-8b36-f8926084fce9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(4bd5d0f0ec2359d425924c0c6cb9a5e350f0985778bbdc444b2f459530ce2fc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-52fcv" podUID="db4f5a8a-a57f-4988-8b36-f8926084fce9"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.940254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" event={"ID":"5d75c040-89a0-4ce7-8991-49f58f7dd168","Type":"ContainerStarted","Data":"1c90e275235fc00ce3af3943e5166e0f7d68d15dfc98052e37452bf0dd234e21"}
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.940624 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.940641 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.940653 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.970857 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.973762 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs" podStartSLOduration=8.973751407 podStartE2EDuration="8.973751407s" podCreationTimestamp="2025-12-05 20:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:22:45.972088278 +0000 UTC m=+736.201899656" watchObservedRunningTime="2025-12-05 20:22:45.973751407 +0000 UTC m=+736.203562775"
Dec 05 20:22:45 crc kubenswrapper[4744]: I1205 20:22:45.986103 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.772994 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-52fcv"]
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.773431 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.773853 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.800616 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(ae19f1613c1305844d8bf631a989e3734d0677648b4f240f000f2a0d3670cd43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.800688 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(ae19f1613c1305844d8bf631a989e3734d0677648b4f240f000f2a0d3670cd43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.800712 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(ae19f1613c1305844d8bf631a989e3734d0677648b4f240f000f2a0d3670cd43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.800760 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-52fcv_openshift-operators(db4f5a8a-a57f-4988-8b36-f8926084fce9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-52fcv_openshift-operators(db4f5a8a-a57f-4988-8b36-f8926084fce9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-52fcv_openshift-operators_db4f5a8a-a57f-4988-8b36-f8926084fce9_0(ae19f1613c1305844d8bf631a989e3734d0677648b4f240f000f2a0d3670cd43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-52fcv" podUID="db4f5a8a-a57f-4988-8b36-f8926084fce9"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.801028 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"]
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.801057 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"]
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.801128 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.801592 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.801874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.802106 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.807978 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-nlfb9"]
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.808080 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.808437 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.875625 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"]
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.875731 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:46 crc kubenswrapper[4744]: I1205 20:22:46.876066 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902421 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(aaca1778ba6b8c50b22a8e97d78464d637d5b28fced3e9b0bef9a1c406cf6259): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902481 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(aaca1778ba6b8c50b22a8e97d78464d637d5b28fced3e9b0bef9a1c406cf6259): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902509 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(aaca1778ba6b8c50b22a8e97d78464d637d5b28fced3e9b0bef9a1c406cf6259): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902550 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-nlfb9_openshift-operators(f152504a-5f82-434d-904e-b9e1f2e49a5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-nlfb9_openshift-operators(f152504a-5f82-434d-904e-b9e1f2e49a5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-nlfb9_openshift-operators_f152504a-5f82-434d-904e-b9e1f2e49a5e_0(aaca1778ba6b8c50b22a8e97d78464d637d5b28fced3e9b0bef9a1c406cf6259): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9" podUID="f152504a-5f82-434d-904e-b9e1f2e49a5e"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902452 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(bccf49cb57bb331f95e1c1b8d4cb82c226daded54311950761835f7bbb5d3c25): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902654 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(bccf49cb57bb331f95e1c1b8d4cb82c226daded54311950761835f7bbb5d3c25): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902677 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(bccf49cb57bb331f95e1c1b8d4cb82c226daded54311950761835f7bbb5d3c25): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.902725 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators(d6cb3e32-cc6f-4091-ae30-5de5790d952c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators(d6cb3e32-cc6f-4091-ae30-5de5790d952c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-6vnbf_openshift-operators_d6cb3e32-cc6f-4091-ae30-5de5790d952c_0(bccf49cb57bb331f95e1c1b8d4cb82c226daded54311950761835f7bbb5d3c25): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf" podUID="d6cb3e32-cc6f-4091-ae30-5de5790d952c"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.914368 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(ee0c8dcb1b0418aafac59d823b536ee3a26744e666942dd9d1023196edbdc5bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.914454 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(ee0c8dcb1b0418aafac59d823b536ee3a26744e666942dd9d1023196edbdc5bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.914479 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(ee0c8dcb1b0418aafac59d823b536ee3a26744e666942dd9d1023196edbdc5bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.914532 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators(73c8ea4a-800d-4d94-9732-f81484c43481)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators(73c8ea4a-800d-4d94-9732-f81484c43481)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_openshift-operators_73c8ea4a-800d-4d94-9732-f81484c43481_0(ee0c8dcb1b0418aafac59d823b536ee3a26744e666942dd9d1023196edbdc5bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8" podUID="73c8ea4a-800d-4d94-9732-f81484c43481"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.933269 4744 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(7e42497f1c7144fb0e140a0b400cebcec7e47bf813767f934fcc6820bccf9935): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.933349 4744 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(7e42497f1c7144fb0e140a0b400cebcec7e47bf813767f934fcc6820bccf9935): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.933379 4744 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(7e42497f1c7144fb0e140a0b400cebcec7e47bf813767f934fcc6820bccf9935): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:22:46 crc kubenswrapper[4744]: E1205 20:22:46.933428 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators(a755ef85-a445-4cd1-bb7b-4bea0bb7b796)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators(a755ef85-a445-4cd1-bb7b-4bea0bb7b796)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_openshift-operators_a755ef85-a445-4cd1-bb7b-4bea0bb7b796_0(7e42497f1c7144fb0e140a0b400cebcec7e47bf813767f934fcc6820bccf9935): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n" podUID="a755ef85-a445-4cd1-bb7b-4bea0bb7b796"
Dec 05 20:22:58 crc kubenswrapper[4744]: I1205 20:22:58.079913 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:58 crc kubenswrapper[4744]: I1205 20:22:58.080716 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:22:58 crc kubenswrapper[4744]: I1205 20:22:58.313073 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-52fcv"]
Dec 05 20:22:58 crc kubenswrapper[4744]: W1205 20:22:58.328663 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4f5a8a_a57f_4988_8b36_f8926084fce9.slice/crio-22f3dc4dcbf342cddf05a78833fc1e54cbfccc742cab97503f3549337cd66685 WatchSource:0}: Error finding container 22f3dc4dcbf342cddf05a78833fc1e54cbfccc742cab97503f3549337cd66685: Status 404 returned error can't find the container with id 22f3dc4dcbf342cddf05a78833fc1e54cbfccc742cab97503f3549337cd66685
Dec 05 20:22:59 crc kubenswrapper[4744]: I1205 20:22:59.004824 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-52fcv" event={"ID":"db4f5a8a-a57f-4988-8b36-f8926084fce9","Type":"ContainerStarted","Data":"22f3dc4dcbf342cddf05a78833fc1e54cbfccc742cab97503f3549337cd66685"}
Dec 05 20:22:59 crc kubenswrapper[4744]: I1205 20:22:59.080498 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:59 crc kubenswrapper[4744]: I1205 20:22:59.081016 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"
Dec 05 20:22:59 crc kubenswrapper[4744]: I1205 20:22:59.388492 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8"]
Dec 05 20:22:59 crc kubenswrapper[4744]: W1205 20:22:59.396843 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c8ea4a_800d_4d94_9732_f81484c43481.slice/crio-78cbe4c7712e09043858477fcb64ebf30c03695a0233db5d7651720906bdc352 WatchSource:0}: Error finding container 78cbe4c7712e09043858477fcb64ebf30c03695a0233db5d7651720906bdc352: Status 404 returned error can't find the container with id 78cbe4c7712e09043858477fcb64ebf30c03695a0233db5d7651720906bdc352
Dec 05 20:23:00 crc kubenswrapper[4744]: I1205 20:23:00.020171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8" event={"ID":"73c8ea4a-800d-4d94-9732-f81484c43481","Type":"ContainerStarted","Data":"78cbe4c7712e09043858477fcb64ebf30c03695a0233db5d7651720906bdc352"}
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.083411 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.083475 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.083604 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.084094 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.084233 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"
Dec 05 20:23:02 crc kubenswrapper[4744]: I1205 20:23:02.084491 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:23:03 crc kubenswrapper[4744]: I1205 20:23:03.472813 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 20:23:07 crc kubenswrapper[4744]: I1205 20:23:07.108800 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-nlfb9"]
Dec 05 20:23:07 crc kubenswrapper[4744]: W1205 20:23:07.354444 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda755ef85_a445_4cd1_bb7b_4bea0bb7b796.slice/crio-70266efa0dff1fc064258c94b8354ff9f1056e0ccba32c21b9799804e889089a WatchSource:0}: Error finding container 70266efa0dff1fc064258c94b8354ff9f1056e0ccba32c21b9799804e889089a: Status 404 returned error can't find the container with id 70266efa0dff1fc064258c94b8354ff9f1056e0ccba32c21b9799804e889089a
Dec 05 20:23:07 crc kubenswrapper[4744]: I1205 20:23:07.355632 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n"]
Dec 05 20:23:07 crc kubenswrapper[4744]: W1205 20:23:07.358194 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cb3e32_cc6f_4091_ae30_5de5790d952c.slice/crio-064199f7f2f0e084ae167613d4ae7eea2b4a3b9962d369e1f17070a2731fb91a WatchSource:0}: Error finding container 064199f7f2f0e084ae167613d4ae7eea2b4a3b9962d369e1f17070a2731fb91a: Status 404 returned error can't find the container with id 064199f7f2f0e084ae167613d4ae7eea2b4a3b9962d369e1f17070a2731fb91a
Dec 05 20:23:07 crc kubenswrapper[4744]: I1205 20:23:07.359568 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf"]
Dec 05 20:23:07 crc kubenswrapper[4744]: I1205 20:23:07.530564 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m5crs"
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.088590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9" event={"ID":"f152504a-5f82-434d-904e-b9e1f2e49a5e","Type":"ContainerStarted","Data":"ec51417ff57d6d3d53fdb6d254f78eb923ebc20a426764c55d74b305cc2fa08b"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.089244 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8" event={"ID":"73c8ea4a-800d-4d94-9732-f81484c43481","Type":"ContainerStarted","Data":"e934e4aeb8d26293d170a5b8d550f08e7d632f3fc0255cfea59d8cb0b560f1d9"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.090206 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf" event={"ID":"d6cb3e32-cc6f-4091-ae30-5de5790d952c","Type":"ContainerStarted","Data":"064199f7f2f0e084ae167613d4ae7eea2b4a3b9962d369e1f17070a2731fb91a"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.092443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-52fcv" event={"ID":"db4f5a8a-a57f-4988-8b36-f8926084fce9","Type":"ContainerStarted","Data":"edb1c8461f59013c2c4ec9166f93859a0866ea873d9f095e2ead463f9d7bf100"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.092570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.095069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n" event={"ID":"a755ef85-a445-4cd1-bb7b-4bea0bb7b796","Type":"ContainerStarted","Data":"2a4f17dd629dbffa3b14c676fc9c41f016efb471ea22e9c9c0743d71c53f09e0"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.095102 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n" event={"ID":"a755ef85-a445-4cd1-bb7b-4bea0bb7b796","Type":"ContainerStarted","Data":"70266efa0dff1fc064258c94b8354ff9f1056e0ccba32c21b9799804e889089a"}
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.112649 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8" podStartSLOduration=17.621003716 podStartE2EDuration="25.112615419s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:22:59.399267181 +0000 UTC m=+749.629078549" lastFinishedPulling="2025-12-05 20:23:06.890878884 +0000 UTC m=+757.120690252" observedRunningTime="2025-12-05 20:23:08.11017485 +0000 UTC m=+758.339986308" watchObservedRunningTime="2025-12-05 20:23:08.112615419 +0000 UTC m=+758.342426787"
Dec 05 20:23:08 crc kubenswrapper[4744]: I1205 20:23:08.159982 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-52fcv" podStartSLOduration=16.629003133 podStartE2EDuration="25.159957175s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:22:58.33150046 +0000 UTC m=+748.561311828" lastFinishedPulling="2025-12-05 20:23:06.862454502 +0000 UTC m=+757.092265870" observedRunningTime="2025-12-05 20:23:08.159903694 +0000 UTC m=+758.389715072" watchObservedRunningTime="2025-12-05 20:23:08.159957175 +0000 UTC m=+758.389768533"
Dec 05 20:23:10 crc kubenswrapper[4744]: I1205 20:23:10.109486 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n" podStartSLOduration=27.109462361 podStartE2EDuration="27.109462361s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:08.190810135 +0000 UTC m=+758.420621503" watchObservedRunningTime="2025-12-05 20:23:10.109462361 +0000 UTC m=+760.339273729"
Dec 05 20:23:14 crc kubenswrapper[4744]: I1205 20:23:14.088366 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-52fcv"
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.139446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9" event={"ID":"f152504a-5f82-434d-904e-b9e1f2e49a5e","Type":"ContainerStarted","Data":"280dcc6b45572de66907deb09c69e9f4f938ce50068891d886c6d10118c6c57a"}
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.139927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.143436 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf" event={"ID":"d6cb3e32-cc6f-4091-ae30-5de5790d952c","Type":"ContainerStarted","Data":"e67c76dee18ea0797efe156b792a3f7155c6415b6bef7139a6d6f3dc1e10025a"}
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.146961 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9"
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.195345 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-6vnbf" podStartSLOduration=25.211591544 podStartE2EDuration="32.195316121s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:23:07.360345789 +0000 UTC m=+757.590157157" lastFinishedPulling="2025-12-05 20:23:14.344070366 +0000 UTC m=+764.573881734" observedRunningTime="2025-12-05 20:23:15.19192137 +0000 UTC m=+765.421732738" watchObservedRunningTime="2025-12-05 20:23:15.195316121 +0000 UTC m=+765.425127529"
Dec 05 20:23:15 crc kubenswrapper[4744]: I1205 20:23:15.196545 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-nlfb9" podStartSLOduration=24.947945217 podStartE2EDuration="32.19653647s" podCreationTimestamp="2025-12-05 20:22:43 +0000 UTC" firstStartedPulling="2025-12-05 20:23:07.14159303 +0000 UTC m=+757.371404398" lastFinishedPulling="2025-12-05 20:23:14.390184283 +0000 UTC m=+764.619995651" observedRunningTime="2025-12-05 20:23:15.172270949 +0000 UTC m=+765.402082357" watchObservedRunningTime="2025-12-05 20:23:15.19653647 +0000 UTC m=+765.426347878"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.217480 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"]
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.219274 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.221934 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.233890 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"]
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.413954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.414043 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.414089 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjrp\" (UniqueName: \"kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.516211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.516349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.516425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjrp\" (UniqueName: \"kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"
Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.517070 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.517093 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.546795 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjrp\" (UniqueName: \"kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:24 crc kubenswrapper[4744]: I1205 20:23:24.843706 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:25 crc kubenswrapper[4744]: I1205 20:23:25.061574 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5"] Dec 05 20:23:25 crc kubenswrapper[4744]: W1205 20:23:25.069805 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23e281a_f6ab_488a_97f1_e8854dedc3c3.slice/crio-bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080 WatchSource:0}: Error finding container bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080: Status 404 returned error can't find the container with id bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080 Dec 05 20:23:25 crc kubenswrapper[4744]: I1205 20:23:25.214056 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" event={"ID":"a23e281a-f6ab-488a-97f1-e8854dedc3c3","Type":"ContainerStarted","Data":"bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080"} Dec 05 20:23:26 crc kubenswrapper[4744]: I1205 20:23:26.219370 4744 generic.go:334] "Generic (PLEG): container finished" podID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerID="5c1c9574b345dd64f3f105bcf48c7aebc50e84b64bdc8282c522ffa17e9a4606" exitCode=0 Dec 05 20:23:26 crc kubenswrapper[4744]: I1205 20:23:26.219577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" event={"ID":"a23e281a-f6ab-488a-97f1-e8854dedc3c3","Type":"ContainerDied","Data":"5c1c9574b345dd64f3f105bcf48c7aebc50e84b64bdc8282c522ffa17e9a4606"} Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.773452 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.775210 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.784907 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.859549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.859609 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5gc\" (UniqueName: \"kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.859769 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.960676 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h5gc\" (UniqueName: \"kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.960759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.960802 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.961231 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.961427 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:27 crc kubenswrapper[4744]: I1205 20:23:27.981836 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9h5gc\" (UniqueName: \"kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc\") pod \"redhat-operators-kdhxj\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:28 crc kubenswrapper[4744]: I1205 20:23:28.096601 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:28 crc kubenswrapper[4744]: I1205 20:23:28.320748 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:28 crc kubenswrapper[4744]: W1205 20:23:28.340076 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod378c2d85_6854_4d11_b648_dd6766a8453e.slice/crio-df9a890a0e93094b97dae67822e6098fe29c04791ef912514e723a24448cfea9 WatchSource:0}: Error finding container df9a890a0e93094b97dae67822e6098fe29c04791ef912514e723a24448cfea9: Status 404 returned error can't find the container with id df9a890a0e93094b97dae67822e6098fe29c04791ef912514e723a24448cfea9 Dec 05 20:23:29 crc kubenswrapper[4744]: I1205 20:23:29.237337 4744 generic.go:334] "Generic (PLEG): container finished" podID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerID="993ca19dea5ffd0d1cd44b95f20a23709cbb1460b1e19357f9dbfedfb92a9d35" exitCode=0 Dec 05 20:23:29 crc kubenswrapper[4744]: I1205 20:23:29.237439 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" event={"ID":"a23e281a-f6ab-488a-97f1-e8854dedc3c3","Type":"ContainerDied","Data":"993ca19dea5ffd0d1cd44b95f20a23709cbb1460b1e19357f9dbfedfb92a9d35"} Dec 05 20:23:29 crc kubenswrapper[4744]: I1205 20:23:29.238419 4744 generic.go:334] "Generic (PLEG): container finished" podID="378c2d85-6854-4d11-b648-dd6766a8453e" containerID="e6613c028fa5a8d0eb70f6180b0325241056e7d58b3c0af7b351279a5abb5271" exitCode=0 Dec 05 20:23:29 crc kubenswrapper[4744]: I1205 20:23:29.238445 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerDied","Data":"e6613c028fa5a8d0eb70f6180b0325241056e7d58b3c0af7b351279a5abb5271"} Dec 05 20:23:29 crc kubenswrapper[4744]: I1205 20:23:29.238472 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerStarted","Data":"df9a890a0e93094b97dae67822e6098fe29c04791ef912514e723a24448cfea9"} Dec 05 20:23:30 crc kubenswrapper[4744]: I1205 20:23:30.245282 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerStarted","Data":"2fee2fdf412bf8a6bd87fc587e5bf43f8dc70912e1468dd27918f07dcab02dec"} Dec 05 20:23:30 crc kubenswrapper[4744]: I1205 20:23:30.247312 4744 generic.go:334] "Generic (PLEG): container finished" podID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerID="cbb49463063f3e604512d369cf49a7e170c4ebc4cd9be551a24e73849a92be51" exitCode=0 Dec 05 20:23:30 crc kubenswrapper[4744]: I1205 20:23:30.247342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" 
event={"ID":"a23e281a-f6ab-488a-97f1-e8854dedc3c3","Type":"ContainerDied","Data":"cbb49463063f3e604512d369cf49a7e170c4ebc4cd9be551a24e73849a92be51"} Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.254114 4744 generic.go:334] "Generic (PLEG): container finished" podID="378c2d85-6854-4d11-b648-dd6766a8453e" containerID="2fee2fdf412bf8a6bd87fc587e5bf43f8dc70912e1468dd27918f07dcab02dec" exitCode=0 Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.254160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerDied","Data":"2fee2fdf412bf8a6bd87fc587e5bf43f8dc70912e1468dd27918f07dcab02dec"} Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.453605 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.523417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util\") pod \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.523841 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle\") pod \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.523972 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjrp\" (UniqueName: \"kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp\") pod \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\" (UID: \"a23e281a-f6ab-488a-97f1-e8854dedc3c3\") " Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.524428 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle" (OuterVolumeSpecName: "bundle") pod "a23e281a-f6ab-488a-97f1-e8854dedc3c3" (UID: "a23e281a-f6ab-488a-97f1-e8854dedc3c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.528801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp" (OuterVolumeSpecName: "kube-api-access-7bjrp") pod "a23e281a-f6ab-488a-97f1-e8854dedc3c3" (UID: "a23e281a-f6ab-488a-97f1-e8854dedc3c3"). InnerVolumeSpecName "kube-api-access-7bjrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.534096 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util" (OuterVolumeSpecName: "util") pod "a23e281a-f6ab-488a-97f1-e8854dedc3c3" (UID: "a23e281a-f6ab-488a-97f1-e8854dedc3c3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.625485 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjrp\" (UniqueName: \"kubernetes.io/projected/a23e281a-f6ab-488a-97f1-e8854dedc3c3-kube-api-access-7bjrp\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.625777 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:31 crc kubenswrapper[4744]: I1205 20:23:31.625789 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a23e281a-f6ab-488a-97f1-e8854dedc3c3-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:32 crc kubenswrapper[4744]: I1205 20:23:32.262738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerStarted","Data":"74424b2e429343100d675713e22dabf58d3e36d1141a207bf027a8dafcb44670"} Dec 05 20:23:32 crc kubenswrapper[4744]: I1205 20:23:32.265201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" event={"ID":"a23e281a-f6ab-488a-97f1-e8854dedc3c3","Type":"ContainerDied","Data":"bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080"} Dec 05 20:23:32 crc kubenswrapper[4744]: I1205 20:23:32.265244 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf1114a5e9abd8a46e9a6c177cf973490682b17490c63304e29d09afd25d080" Dec 05 20:23:32 crc kubenswrapper[4744]: I1205 20:23:32.265702 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.286759 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kdhxj" podStartSLOduration=3.538054156 podStartE2EDuration="6.28674424s" podCreationTimestamp="2025-12-05 20:23:27 +0000 UTC" firstStartedPulling="2025-12-05 20:23:29.239819788 +0000 UTC m=+779.469631156" lastFinishedPulling="2025-12-05 20:23:31.988509882 +0000 UTC m=+782.218321240" observedRunningTime="2025-12-05 20:23:33.283141349 +0000 UTC m=+783.512952717" watchObservedRunningTime="2025-12-05 20:23:33.28674424 +0000 UTC m=+783.516555608" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.713521 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm"] Dec 05 20:23:33 crc kubenswrapper[4744]: E1205 20:23:33.714080 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="extract" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.714101 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="extract" Dec 05 20:23:33 crc kubenswrapper[4744]: E1205 20:23:33.714117 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="pull" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.714125 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="pull" Dec 05 20:23:33 crc kubenswrapper[4744]: E1205 20:23:33.714135 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="util" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.714143 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="util" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.714273 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23e281a-f6ab-488a-97f1-e8854dedc3c3" containerName="extract" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.714735 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.716540 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-lbpzc" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.717111 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.726178 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.729238 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm"] Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.750757 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xqt\" (UniqueName: \"kubernetes.io/projected/b9185029-82a1-4112-9539-86612a761dd9-kube-api-access-s9xqt\") pod \"nmstate-operator-5b5b58f5c8-6gznm\" (UID: \"b9185029-82a1-4112-9539-86612a761dd9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.852222 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xqt\" (UniqueName: \"kubernetes.io/projected/b9185029-82a1-4112-9539-86612a761dd9-kube-api-access-s9xqt\") pod \"nmstate-operator-5b5b58f5c8-6gznm\" (UID: \"b9185029-82a1-4112-9539-86612a761dd9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" Dec 05 20:23:33 crc kubenswrapper[4744]: I1205 20:23:33.871114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xqt\" (UniqueName: \"kubernetes.io/projected/b9185029-82a1-4112-9539-86612a761dd9-kube-api-access-s9xqt\") pod \"nmstate-operator-5b5b58f5c8-6gznm\" (UID: \"b9185029-82a1-4112-9539-86612a761dd9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" Dec 05 20:23:34 crc kubenswrapper[4744]: I1205 20:23:34.029024 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" Dec 05 20:23:34 crc kubenswrapper[4744]: I1205 20:23:34.263275 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm"] Dec 05 20:23:34 crc kubenswrapper[4744]: W1205 20:23:34.270507 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9185029_82a1_4112_9539_86612a761dd9.slice/crio-44cb6d1317897a1534ebfa4e31e2e720d4ed00ec4e7a9b9ab14abb0d5141d332 WatchSource:0}: Error finding container 44cb6d1317897a1534ebfa4e31e2e720d4ed00ec4e7a9b9ab14abb0d5141d332: Status 404 returned error can't find the container with id 44cb6d1317897a1534ebfa4e31e2e720d4ed00ec4e7a9b9ab14abb0d5141d332 Dec 05 20:23:35 crc kubenswrapper[4744]: I1205 20:23:35.292907 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" event={"ID":"b9185029-82a1-4112-9539-86612a761dd9","Type":"ContainerStarted","Data":"44cb6d1317897a1534ebfa4e31e2e720d4ed00ec4e7a9b9ab14abb0d5141d332"} Dec 05 20:23:38 crc kubenswrapper[4744]: I1205 20:23:38.096870 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:38 crc kubenswrapper[4744]: I1205 20:23:38.097326 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:38 crc kubenswrapper[4744]: I1205 20:23:38.160568 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:38 crc kubenswrapper[4744]: I1205 20:23:38.370604 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:41 crc kubenswrapper[4744]: I1205 20:23:41.763145 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:41 crc kubenswrapper[4744]: I1205 20:23:41.763670 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kdhxj" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="registry-server" containerID="cri-o://74424b2e429343100d675713e22dabf58d3e36d1141a207bf027a8dafcb44670" gracePeriod=2 Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.352009 4744 generic.go:334] "Generic (PLEG): container finished" podID="378c2d85-6854-4d11-b648-dd6766a8453e" containerID="74424b2e429343100d675713e22dabf58d3e36d1141a207bf027a8dafcb44670" exitCode=0 Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.352219 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerDied","Data":"74424b2e429343100d675713e22dabf58d3e36d1141a207bf027a8dafcb44670"} Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.832777 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.901651 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content\") pod \"378c2d85-6854-4d11-b648-dd6766a8453e\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.901708 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h5gc\" (UniqueName: \"kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc\") pod \"378c2d85-6854-4d11-b648-dd6766a8453e\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.901735 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities\") pod \"378c2d85-6854-4d11-b648-dd6766a8453e\" (UID: \"378c2d85-6854-4d11-b648-dd6766a8453e\") " Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.902858 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities" (OuterVolumeSpecName: "utilities") pod "378c2d85-6854-4d11-b648-dd6766a8453e" (UID: "378c2d85-6854-4d11-b648-dd6766a8453e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:45 crc kubenswrapper[4744]: I1205 20:23:45.906452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc" (OuterVolumeSpecName: "kube-api-access-9h5gc") pod "378c2d85-6854-4d11-b648-dd6766a8453e" (UID: "378c2d85-6854-4d11-b648-dd6766a8453e"). InnerVolumeSpecName "kube-api-access-9h5gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.003153 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h5gc\" (UniqueName: \"kubernetes.io/projected/378c2d85-6854-4d11-b648-dd6766a8453e-kube-api-access-9h5gc\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.003193 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.007096 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "378c2d85-6854-4d11-b648-dd6766a8453e" (UID: "378c2d85-6854-4d11-b648-dd6766a8453e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.103914 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378c2d85-6854-4d11-b648-dd6766a8453e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.361479 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kdhxj" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.363213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kdhxj" event={"ID":"378c2d85-6854-4d11-b648-dd6766a8453e","Type":"ContainerDied","Data":"df9a890a0e93094b97dae67822e6098fe29c04791ef912514e723a24448cfea9"} Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.363398 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" event={"ID":"b9185029-82a1-4112-9539-86612a761dd9","Type":"ContainerStarted","Data":"900b834cc2c2b62458bfa81c33d84d085bf651367fd0f84fde3a9b8150ab5d25"} Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.363546 4744 scope.go:117] "RemoveContainer" containerID="74424b2e429343100d675713e22dabf58d3e36d1141a207bf027a8dafcb44670" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.387821 4744 scope.go:117] "RemoveContainer" containerID="2fee2fdf412bf8a6bd87fc587e5bf43f8dc70912e1468dd27918f07dcab02dec" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.388328 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.412800 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kdhxj"] Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.416093 4744 scope.go:117] "RemoveContainer" containerID="e6613c028fa5a8d0eb70f6180b0325241056e7d58b3c0af7b351279a5abb5271" Dec 05 20:23:46 crc kubenswrapper[4744]: I1205 20:23:46.416248 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6gznm" podStartSLOduration=2.18832268 podStartE2EDuration="13.4162298s" podCreationTimestamp="2025-12-05 20:23:33 +0000 UTC" firstStartedPulling="2025-12-05 20:23:34.273708859 +0000 UTC m=+784.503520237" lastFinishedPulling="2025-12-05 20:23:45.501615989 +0000 UTC m=+795.731427357" observedRunningTime="2025-12-05 20:23:46.401744825 +0000 UTC m=+796.631556203" watchObservedRunningTime="2025-12-05 20:23:46.4162298 +0000 UTC m=+796.646041188" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.279915 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk"] Dec 05 20:23:47 crc kubenswrapper[4744]: E1205 20:23:47.280186 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="registry-server" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.280207 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="registry-server" Dec 05 20:23:47 crc kubenswrapper[4744]: E1205 20:23:47.280224 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="extract-utilities" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.280230 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="extract-utilities" Dec 05 20:23:47 crc kubenswrapper[4744]: E1205 20:23:47.280242 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="extract-content" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.280249 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="extract-content" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.280361 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" containerName="registry-server" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.281045 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.282323 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-c922f" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.291164 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.296026 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.296705 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.298654 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.322426 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6mr\" (UniqueName: \"kubernetes.io/projected/6df3f631-039c-4df3-a991-9775663959e3-kube-api-access-hs6mr\") pod \"nmstate-metrics-7f946cbc9-nzkjk\" (UID: \"6df3f631-039c-4df3-a991-9775663959e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.329041 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fhw4s"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.329779 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.346410 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423729 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxzn\" (UniqueName: \"kubernetes.io/projected/dc5309cc-3e62-4f06-94ac-c7b938ff5373-kube-api-access-qkxzn\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zzhl\" (UniqueName: \"kubernetes.io/projected/5e13deba-1699-48ea-9085-425e98206f8d-kube-api-access-6zzhl\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423817 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-dbus-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423869 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-nmstate-lock\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc5309cc-3e62-4f06-94ac-c7b938ff5373-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423924 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6mr\" (UniqueName: \"kubernetes.io/projected/6df3f631-039c-4df3-a991-9775663959e3-kube-api-access-hs6mr\") pod \"nmstate-metrics-7f946cbc9-nzkjk\" (UID: \"6df3f631-039c-4df3-a991-9775663959e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.423964 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-ovs-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.426426 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.427051 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.428913 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.429248 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.429592 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m5nvg" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.436569 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.450627 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6mr\" (UniqueName: \"kubernetes.io/projected/6df3f631-039c-4df3-a991-9775663959e3-kube-api-access-hs6mr\") pod \"nmstate-metrics-7f946cbc9-nzkjk\" (UID: \"6df3f631-039c-4df3-a991-9775663959e3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e18700-c2eb-4cae-8be4-4463b8a8071a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxzn\" (UniqueName: \"kubernetes.io/projected/dc5309cc-3e62-4f06-94ac-c7b938ff5373-kube-api-access-qkxzn\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526529 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zzhl\" (UniqueName: \"kubernetes.io/projected/5e13deba-1699-48ea-9085-425e98206f8d-kube-api-access-6zzhl\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526553 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-dbus-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526607 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc5309cc-3e62-4f06-94ac-c7b938ff5373-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526626 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-nmstate-lock\") pod \"nmstate-handler-fhw4s\" (UID: 
\"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526673 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-ovs-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526700 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2e18700-c2eb-4cae-8be4-4463b8a8071a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.526728 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrnjs\" (UniqueName: \"kubernetes.io/projected/b2e18700-c2eb-4cae-8be4-4463b8a8071a-kube-api-access-lrnjs\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.532635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-nmstate-lock\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.533243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-dbus-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.533807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5e13deba-1699-48ea-9085-425e98206f8d-ovs-socket\") pod \"nmstate-handler-fhw4s\" (UID: \"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.543567 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc5309cc-3e62-4f06-94ac-c7b938ff5373-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.565302 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxzn\" (UniqueName: \"kubernetes.io/projected/dc5309cc-3e62-4f06-94ac-c7b938ff5373-kube-api-access-qkxzn\") pod \"nmstate-webhook-5f6d4c5ccb-4jr5z\" (UID: \"dc5309cc-3e62-4f06-94ac-c7b938ff5373\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.570090 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zzhl\" (UniqueName: \"kubernetes.io/projected/5e13deba-1699-48ea-9085-425e98206f8d-kube-api-access-6zzhl\") pod \"nmstate-handler-fhw4s\" (UID: 
\"5e13deba-1699-48ea-9085-425e98206f8d\") " pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.597589 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.627905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrnjs\" (UniqueName: \"kubernetes.io/projected/b2e18700-c2eb-4cae-8be4-4463b8a8071a-kube-api-access-lrnjs\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.627978 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e18700-c2eb-4cae-8be4-4463b8a8071a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.628062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2e18700-c2eb-4cae-8be4-4463b8a8071a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.629044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2e18700-c2eb-4cae-8be4-4463b8a8071a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.631101 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.644336 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e18700-c2eb-4cae-8be4-4463b8a8071a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.650388 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.650407 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrnjs\" (UniqueName: \"kubernetes.io/projected/b2e18700-c2eb-4cae-8be4-4463b8a8071a-kube-api-access-lrnjs\") pod \"nmstate-console-plugin-7fbb5f6569-4vdgs\" (UID: \"b2e18700-c2eb-4cae-8be4-4463b8a8071a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.675351 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.676054 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: W1205 20:23:47.686432 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e13deba_1699_48ea_9085_425e98206f8d.slice/crio-8dd59b5e5d45ce4d35b522fcfed1a3b2c8368314785e177e3dca8890d572b50d WatchSource:0}: Error finding container 8dd59b5e5d45ce4d35b522fcfed1a3b2c8368314785e177e3dca8890d572b50d: Status 404 returned error can't find the container with id 8dd59b5e5d45ce4d35b522fcfed1a3b2c8368314785e177e3dca8890d572b50d Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.697632 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734534 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4b9\" (UniqueName: \"kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734852 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734872 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734890 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734921 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734948 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.734982 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca\") pod 
\"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.747582 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836191 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4b9\" (UniqueName: \"kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836284 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836377 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.836465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.837479 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.838097 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.838133 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.838620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.847987 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.854405 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.865575 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4b9\" (UniqueName: \"kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9\") pod \"console-d675d5484-zjxxm\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.912588 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk"] Dec 05 20:23:47 crc kubenswrapper[4744]: W1205 20:23:47.914582 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df3f631_039c_4df3_a991_9775663959e3.slice/crio-716456be7569eb17b51acb62c442cc19359ef3f693f292e44e811149be74d99c WatchSource:0}: Error finding container 716456be7569eb17b51acb62c442cc19359ef3f693f292e44e811149be74d99c: Status 404 returned error can't find the container with id 716456be7569eb17b51acb62c442cc19359ef3f693f292e44e811149be74d99c Dec 05 20:23:47 crc kubenswrapper[4744]: I1205 20:23:47.942670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z"] Dec 05 20:23:47 crc kubenswrapper[4744]: W1205 20:23:47.952730 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5309cc_3e62_4f06_94ac_c7b938ff5373.slice/crio-0db76b69626e419ca942f7cf7493dea7fd633331220d75b1460d180a0a023216 WatchSource:0}: Error finding container 0db76b69626e419ca942f7cf7493dea7fd633331220d75b1460d180a0a023216: Status 404 returned error 
Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.004578 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.020762 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs"] Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.103963 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378c2d85-6854-4d11-b648-dd6766a8453e" path="/var/lib/kubelet/pods/378c2d85-6854-4d11-b648-dd6766a8453e/volumes" Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.213634 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:23:48 crc kubenswrapper[4744]: W1205 20:23:48.216686 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203cad7e_b0d1_4c93_b217_d7cf0c6f1c5d.slice/crio-b0f89f9f5aac0ccf95aa4780e6b03abebe337025c88d1f7b5c6932b72ca3bd43 WatchSource:0}: Error finding container b0f89f9f5aac0ccf95aa4780e6b03abebe337025c88d1f7b5c6932b72ca3bd43: Status 404 returned error can't find the container with id b0f89f9f5aac0ccf95aa4780e6b03abebe337025c88d1f7b5c6932b72ca3bd43 Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.382464 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" event={"ID":"dc5309cc-3e62-4f06-94ac-c7b938ff5373","Type":"ContainerStarted","Data":"0db76b69626e419ca942f7cf7493dea7fd633331220d75b1460d180a0a023216"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.383751 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" event={"ID":"b2e18700-c2eb-4cae-8be4-4463b8a8071a","Type":"ContainerStarted","Data":"e657850d7e7c4815576ce0c04ddea6598aa25bf92c0b250f3b084a7ed3040a33"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.385284 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" event={"ID":"6df3f631-039c-4df3-a991-9775663959e3","Type":"ContainerStarted","Data":"716456be7569eb17b51acb62c442cc19359ef3f693f292e44e811149be74d99c"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.386864 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d675d5484-zjxxm" event={"ID":"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d","Type":"ContainerStarted","Data":"fed1f2eaf2d53105dc8a5256cf9b31f5d69099d43244e82515aa6bfd6cfcb5f9"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.386894 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d675d5484-zjxxm" event={"ID":"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d","Type":"ContainerStarted","Data":"b0f89f9f5aac0ccf95aa4780e6b03abebe337025c88d1f7b5c6932b72ca3bd43"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.387900 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fhw4s" event={"ID":"5e13deba-1699-48ea-9085-425e98206f8d","Type":"ContainerStarted","Data":"8dd59b5e5d45ce4d35b522fcfed1a3b2c8368314785e177e3dca8890d572b50d"} Dec 05 20:23:48 crc kubenswrapper[4744]: I1205 20:23:48.405658 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d675d5484-zjxxm" 
podStartSLOduration=1.40564324 podStartE2EDuration="1.40564324s" podCreationTimestamp="2025-12-05 20:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:23:48.403587867 +0000 UTC m=+798.633399235" watchObservedRunningTime="2025-12-05 20:23:48.40564324 +0000 UTC m=+798.635454598" Dec 05 20:23:49 crc kubenswrapper[4744]: I1205 20:23:49.806524 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:23:49 crc kubenswrapper[4744]: I1205 20:23:49.806888 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.409531 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fhw4s" event={"ID":"5e13deba-1699-48ea-9085-425e98206f8d","Type":"ContainerStarted","Data":"e80630a8d2c9c3d6e68401060ba677e92e9fb88fe496c56cac82d0c1989682d4"} Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.410242 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.416409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" event={"ID":"dc5309cc-3e62-4f06-94ac-c7b938ff5373","Type":"ContainerStarted","Data":"5d54dcc56c73983c02bdf334ac598f53194c63700cb80f8fd45ceb9442823c6f"} Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.416624 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.417624 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" event={"ID":"b2e18700-c2eb-4cae-8be4-4463b8a8071a","Type":"ContainerStarted","Data":"73ea5fb675bfe7c6278f1a173b8e70ab1802bbaaef2d44037d1c1e1679312af3"} Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.427786 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fhw4s" podStartSLOduration=1.240516517 podStartE2EDuration="4.4277707s" podCreationTimestamp="2025-12-05 20:23:47 +0000 UTC" firstStartedPulling="2025-12-05 20:23:47.700457995 +0000 UTC m=+797.930269363" lastFinishedPulling="2025-12-05 20:23:50.887712178 +0000 UTC m=+801.117523546" observedRunningTime="2025-12-05 20:23:51.422398644 +0000 UTC m=+801.652210012" watchObservedRunningTime="2025-12-05 20:23:51.4277707 +0000 UTC m=+801.657582068"
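
The startup-latency numbers above appear internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). For nmstate-handler-fhw4s: 4.4277707s - (20:23:50.887712178 - 20:23:47.700457995) = 1.240516517s, matching the logged value. A small sketch of that arithmetic using the logged timestamps (illustrative only, not the kubelet's pod_startup_latency_tracker implementation):

```go
package main

import (
	"fmt"
	"time"
)

// Reproduce the startup-duration arithmetic for
// openshift-nmstate/nmstate-handler-fhw4s from the entry above.
func main() {
	parse := func(s string) time.Time {
		// Go accepts a fractional second in the input even though the
		// layout omits it.
		t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-05 20:23:47 +0000 UTC")
	firstPull := parse("2025-12-05 20:23:47.700457995 +0000 UTC")
	lastPull := parse("2025-12-05 20:23:50.887712178 +0000 UTC")
	running := parse("2025-12-05 20:23:51.4277707 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 4.4277707s
	slo := e2e - lastPull.Sub(firstPull) // minus pull time: 1.240516517s
	fmt.Println(e2e, slo)
}
```
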
Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.441362 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" podStartSLOduration=1.527658238 podStartE2EDuration="4.441337381s" podCreationTimestamp="2025-12-05 20:23:47 +0000 UTC" firstStartedPulling="2025-12-05 20:23:47.95556042 +0000 UTC m=+798.185371788" lastFinishedPulling="2025-12-05 20:23:50.869239563 +0000 UTC m=+801.099050931" observedRunningTime="2025-12-05 20:23:51.43733481 +0000 UTC m=+801.667146198" watchObservedRunningTime="2025-12-05 20:23:51.441337381 +0000 UTC m=+801.671148759" Dec 05 20:23:51 crc kubenswrapper[4744]: I1205 20:23:51.460921 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vdgs" podStartSLOduration=1.6477042370000001 podStartE2EDuration="4.460905153s" podCreationTimestamp="2025-12-05 20:23:47 +0000 UTC" firstStartedPulling="2025-12-05 20:23:48.056107099 +0000 UTC m=+798.285918467" lastFinishedPulling="2025-12-05 20:23:50.869308025 +0000 UTC m=+801.099119383" observedRunningTime="2025-12-05 20:23:51.46076181 +0000 UTC m=+801.690573198" watchObservedRunningTime="2025-12-05 20:23:51.460905153 +0000 UTC m=+801.690716521" Dec 05 20:23:53 crc kubenswrapper[4744]: I1205 20:23:53.430224 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" event={"ID":"6df3f631-039c-4df3-a991-9775663959e3","Type":"ContainerStarted","Data":"acb94b637ce1cbacc90149377ea66e73b35f6385acb05a227382d8323b7f4cae"} Dec 05 20:23:55 crc kubenswrapper[4744]: I1205 20:23:55.449607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" event={"ID":"6df3f631-039c-4df3-a991-9775663959e3","Type":"ContainerStarted","Data":"93b175a946d29a4793e02b4c433fdc2adf4b6450cb395f7b2fee6a54b716c351"} Dec 05 20:23:55 crc kubenswrapper[4744]: I1205 20:23:55.471473 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-nzkjk" podStartSLOduration=1.098555937 podStartE2EDuration="8.4714559s" podCreationTimestamp="2025-12-05 20:23:47 +0000 UTC" firstStartedPulling="2025-12-05 20:23:47.917794451 +0000 UTC m=+798.147605819" lastFinishedPulling="2025-12-05 20:23:55.290694414 +0000 UTC m=+805.520505782" observedRunningTime="2025-12-05 20:23:55.466909566 +0000 UTC m=+805.696720954" watchObservedRunningTime="2025-12-05 20:23:55.4714559 +0000 UTC m=+805.701267258" Dec 05 20:23:57 crc kubenswrapper[4744]: I1205 20:23:57.672586 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fhw4s" Dec 05 20:23:58 crc kubenswrapper[4744]: I1205 20:23:58.005350 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:58 crc kubenswrapper[4744]: I1205 20:23:58.005410 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:58 crc kubenswrapper[4744]: I1205 20:23:58.010054 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:58 crc kubenswrapper[4744]: I1205 20:23:58.475414 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:23:58 crc kubenswrapper[4744]: I1205 20:23:58.528097 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:24:07 crc kubenswrapper[4744]: I1205 20:24:07.636650 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4jr5z" Dec 05 20:24:19 crc kubenswrapper[4744]: I1205 20:24:19.807540 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:24:19 crc kubenswrapper[4744]: I1205 20:24:19.808100 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.620647 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv"] Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.622611 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.627191 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.628693 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv"] Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.743203 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.743490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.743613 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjhv\" (UniqueName: \"kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.844699 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.844766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjhv\" (UniqueName: 
\"kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.844817 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.845192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.845208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.866790 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjhv\" (UniqueName: \"kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:22 crc kubenswrapper[4744]: I1205 20:24:22.957795 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:23 crc kubenswrapper[4744]: I1205 20:24:23.356554 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv"] Dec 05 20:24:23 crc kubenswrapper[4744]: I1205 20:24:23.571267 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lgg2b" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" containerID="cri-o://7105d8472e4363e2fc82c8c4b5bfae502cd8a4cad41e5e729eabbf09a2090d86" gracePeriod=15 Dec 05 20:24:23 crc kubenswrapper[4744]: I1205 20:24:23.619322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" event={"ID":"9c2bb82a-ab47-491c-8379-7204ae825090","Type":"ContainerStarted","Data":"2dc29c15602997f5e3c08d590847e6acef86ad1dc3e5d3497a7d667c791e83f7"} Dec 05 20:24:26 crc kubenswrapper[4744]: I1205 20:24:26.643569 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c2bb82a-ab47-491c-8379-7204ae825090" containerID="be20e9b97d6667b50e7fa3d4be2e097b5770e50e366c37c793843ffec7c21f09" exitCode=0 Dec 05 20:24:26 crc kubenswrapper[4744]: I1205 20:24:26.643659 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" event={"ID":"9c2bb82a-ab47-491c-8379-7204ae825090","Type":"ContainerDied","Data":"be20e9b97d6667b50e7fa3d4be2e097b5770e50e366c37c793843ffec7c21f09"} Dec 05 20:24:26 crc kubenswrapper[4744]: I1205 20:24:26.655419 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lgg2b_55e3c0e4-3a89-48b0-a218-f89546c09a5d/console/0.log" Dec 05 20:24:26 crc kubenswrapper[4744]: I1205 20:24:26.655474 4744 generic.go:334] "Generic (PLEG): container finished" podID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerID="7105d8472e4363e2fc82c8c4b5bfae502cd8a4cad41e5e729eabbf09a2090d86" exitCode=2 Dec 05 20:24:26 crc kubenswrapper[4744]: I1205 20:24:26.655538 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lgg2b" event={"ID":"55e3c0e4-3a89-48b0-a218-f89546c09a5d","Type":"ContainerDied","Data":"7105d8472e4363e2fc82c8c4b5bfae502cd8a4cad41e5e729eabbf09a2090d86"} Dec 05 20:24:28 crc kubenswrapper[4744]: I1205 20:24:28.295752 4744 patch_prober.go:28] interesting pod/console-f9d7485db-lgg2b container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 05 20:24:28 crc kubenswrapper[4744]: I1205 20:24:28.296222 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-lgg2b" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.085680 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lgg2b_55e3c0e4-3a89-48b0-a218-f89546c09a5d/console/0.log" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.086002 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145070 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145151 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145185 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145212 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145276 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145319 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145373 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2qzb\" (UniqueName: \"kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb\") pod \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\" (UID: \"55e3c0e4-3a89-48b0-a218-f89546c09a5d\") " Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca" (OuterVolumeSpecName: "service-ca") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config" (OuterVolumeSpecName: "console-config") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.145961 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.151856 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb" (OuterVolumeSpecName: "kube-api-access-d2qzb") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "kube-api-access-d2qzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.151903 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.151972 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55e3c0e4-3a89-48b0-a218-f89546c09a5d" (UID: "55e3c0e4-3a89-48b0-a218-f89546c09a5d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246552 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246591 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246610 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2qzb\" (UniqueName: \"kubernetes.io/projected/55e3c0e4-3a89-48b0-a218-f89546c09a5d-kube-api-access-d2qzb\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246622 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246632 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246642 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e3c0e4-3a89-48b0-a218-f89546c09a5d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.246652 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55e3c0e4-3a89-48b0-a218-f89546c09a5d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.673761 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lgg2b_55e3c0e4-3a89-48b0-a218-f89546c09a5d/console/0.log" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.673829 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lgg2b" event={"ID":"55e3c0e4-3a89-48b0-a218-f89546c09a5d","Type":"ContainerDied","Data":"660fac3d7a65e1f8968356f827b875088609e7e24c40cc6c4dd47da4d7ed80c7"} Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.673877 4744 scope.go:117] "RemoveContainer" containerID="7105d8472e4363e2fc82c8c4b5bfae502cd8a4cad41e5e729eabbf09a2090d86" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.673930 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lgg2b" Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.708893 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:24:29 crc kubenswrapper[4744]: I1205 20:24:29.714602 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lgg2b"] Dec 05 20:24:30 crc kubenswrapper[4744]: I1205 20:24:30.087061 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" path="/var/lib/kubelet/pods/55e3c0e4-3a89-48b0-a218-f89546c09a5d/volumes" Dec 05 20:24:32 crc kubenswrapper[4744]: I1205 20:24:32.693641 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c2bb82a-ab47-491c-8379-7204ae825090" containerID="047ed62f8c01a17196250d02fc41e801d2acaeab00760d58095e25ed2cba8204" exitCode=0 Dec 05 20:24:32 crc kubenswrapper[4744]: I1205 20:24:32.693711 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" event={"ID":"9c2bb82a-ab47-491c-8379-7204ae825090","Type":"ContainerDied","Data":"047ed62f8c01a17196250d02fc41e801d2acaeab00760d58095e25ed2cba8204"} Dec 05 20:24:33 crc kubenswrapper[4744]: I1205 20:24:33.711455 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c2bb82a-ab47-491c-8379-7204ae825090" containerID="03841d6c8aa4837619b24a1ff9f7f9ef2401d45df0bfd771d24a852aa225efee" exitCode=0 Dec 05 20:24:33 crc kubenswrapper[4744]: I1205 20:24:33.711550 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" event={"ID":"9c2bb82a-ab47-491c-8379-7204ae825090","Type":"ContainerDied","Data":"03841d6c8aa4837619b24a1ff9f7f9ef2401d45df0bfd771d24a852aa225efee"} Dec 05 20:24:34 crc kubenswrapper[4744]: I1205 20:24:34.960797 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.026199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle\") pod \"9c2bb82a-ab47-491c-8379-7204ae825090\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.026309 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util\") pod \"9c2bb82a-ab47-491c-8379-7204ae825090\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.026407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzjhv\" (UniqueName: \"kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv\") pod \"9c2bb82a-ab47-491c-8379-7204ae825090\" (UID: \"9c2bb82a-ab47-491c-8379-7204ae825090\") " Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.027888 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle" (OuterVolumeSpecName: "bundle") pod "9c2bb82a-ab47-491c-8379-7204ae825090" (UID: "9c2bb82a-ab47-491c-8379-7204ae825090"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.033308 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv" (OuterVolumeSpecName: "kube-api-access-xzjhv") pod "9c2bb82a-ab47-491c-8379-7204ae825090" (UID: "9c2bb82a-ab47-491c-8379-7204ae825090"). InnerVolumeSpecName "kube-api-access-xzjhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.038368 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util" (OuterVolumeSpecName: "util") pod "9c2bb82a-ab47-491c-8379-7204ae825090" (UID: "9c2bb82a-ab47-491c-8379-7204ae825090"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.128477 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzjhv\" (UniqueName: \"kubernetes.io/projected/9c2bb82a-ab47-491c-8379-7204ae825090-kube-api-access-xzjhv\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.128533 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.128550 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c2bb82a-ab47-491c-8379-7204ae825090-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.727475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" event={"ID":"9c2bb82a-ab47-491c-8379-7204ae825090","Type":"ContainerDied","Data":"2dc29c15602997f5e3c08d590847e6acef86ad1dc3e5d3497a7d667c791e83f7"} Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.727546 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv" Dec 05 20:24:35 crc kubenswrapper[4744]: I1205 20:24:35.727551 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dc29c15602997f5e3c08d590847e6acef86ad1dc3e5d3497a7d667c791e83f7" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.369036 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf"] Dec 05 20:24:48 crc kubenswrapper[4744]: E1205 20:24:48.369818 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="pull" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.369833 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="pull" Dec 05 20:24:48 crc kubenswrapper[4744]: E1205 20:24:48.369853 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.369861 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" Dec 05 20:24:48 crc kubenswrapper[4744]: E1205 20:24:48.369871 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="extract" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.369878 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="extract" Dec 05 20:24:48 crc kubenswrapper[4744]: E1205 20:24:48.369899 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="util" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.369906 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="util" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.370016 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2bb82a-ab47-491c-8379-7204ae825090" containerName="extract" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.370030 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e3c0e4-3a89-48b0-a218-f89546c09a5d" containerName="console" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.370526 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.372380 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.372685 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.372741 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.372769 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.373630 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hgxmv" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.384678 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf"] Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.537681 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42d4z\" (UniqueName: \"kubernetes.io/projected/c15b5414-dbb3-461e-9108-26f514628d7b-kube-api-access-42d4z\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.537998 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-apiservice-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.538062 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-webhook-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.639214 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-webhook-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.639319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42d4z\" (UniqueName: \"kubernetes.io/projected/c15b5414-dbb3-461e-9108-26f514628d7b-kube-api-access-42d4z\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.639351 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-apiservice-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.659183 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-apiservice-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.666954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c15b5414-dbb3-461e-9108-26f514628d7b-webhook-cert\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.670341 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42d4z\" (UniqueName: \"kubernetes.io/projected/c15b5414-dbb3-461e-9108-26f514628d7b-kube-api-access-42d4z\") pod \"metallb-operator-controller-manager-579c6fcd5-kp2mf\" (UID: \"c15b5414-dbb3-461e-9108-26f514628d7b\") " pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.736945 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.803814 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc"] Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.804949 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.806658 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-t966b" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.807813 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.808034 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.818229 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc"] Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.943579 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-apiservice-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.943646 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8rz\" (UniqueName: \"kubernetes.io/projected/70f8386c-29b0-4cc8-9d75-740e8796f01a-kube-api-access-6d8rz\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.943672 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-webhook-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:48 crc kubenswrapper[4744]: I1205 20:24:48.999740 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf"] Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.046013 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-webhook-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.048316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-apiservice-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.048373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8rz\" (UniqueName: \"kubernetes.io/projected/70f8386c-29b0-4cc8-9d75-740e8796f01a-kube-api-access-6d8rz\") pod 
\"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.052195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-webhook-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.052198 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70f8386c-29b0-4cc8-9d75-740e8796f01a-apiservice-cert\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.063833 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8rz\" (UniqueName: \"kubernetes.io/projected/70f8386c-29b0-4cc8-9d75-740e8796f01a-kube-api-access-6d8rz\") pod \"metallb-operator-webhook-server-686966fbfb-zrgkc\" (UID: \"70f8386c-29b0-4cc8-9d75-740e8796f01a\") " pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.120689 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.322427 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc"] Dec 05 20:24:49 crc kubenswrapper[4744]: W1205 20:24:49.335460 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f8386c_29b0_4cc8_9d75_740e8796f01a.slice/crio-a2d5e825adc770ec90c8daa3fd25a63cec31d39a4d87ab789c32c2c0b533a621 WatchSource:0}: Error finding container a2d5e825adc770ec90c8daa3fd25a63cec31d39a4d87ab789c32c2c0b533a621: Status 404 returned error can't find the container with id a2d5e825adc770ec90c8daa3fd25a63cec31d39a4d87ab789c32c2c0b533a621 Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.806742 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.807030 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.807078 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.807629 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.807679 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942" gracePeriod=600 Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.853923 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" event={"ID":"c15b5414-dbb3-461e-9108-26f514628d7b","Type":"ContainerStarted","Data":"46f1b2a51c92b2b2e1ee8b10dc59206ad27ddeb142ef245006491f8b507890bd"} Dec 05 20:24:49 crc kubenswrapper[4744]: I1205 20:24:49.854884 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" event={"ID":"70f8386c-29b0-4cc8-9d75-740e8796f01a","Type":"ContainerStarted","Data":"a2d5e825adc770ec90c8daa3fd25a63cec31d39a4d87ab789c32c2c0b533a621"} Dec 05 20:24:50 crc kubenswrapper[4744]: I1205 20:24:50.863153 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942" exitCode=0 Dec 05 20:24:50 crc kubenswrapper[4744]: I1205 20:24:50.863190 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942"} Dec 05 20:24:50 crc kubenswrapper[4744]: I1205 20:24:50.863507 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc"} Dec 05 20:24:50 crc kubenswrapper[4744]: I1205 20:24:50.863532 4744 scope.go:117] "RemoveContainer" containerID="9b8507204d764e1fcbea9060b778025759252a229905c4a16f53f059b113aeda" Dec 05 20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.897601 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" event={"ID":"c15b5414-dbb3-461e-9108-26f514628d7b","Type":"ContainerStarted","Data":"0fba5b564e9d1b455f665777d6ca529752fd84938ac6241533226890c3def104"} Dec 05 20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.897964 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.899161 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" event={"ID":"70f8386c-29b0-4cc8-9d75-740e8796f01a","Type":"ContainerStarted","Data":"c6ef5f35980ab7b1b3617e1aced930f58c543be4849623a9bdfe77ef1b662bcb"} Dec 05 20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.899379 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 
20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.918795 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" podStartSLOduration=1.581742143 podStartE2EDuration="6.918772178s" podCreationTimestamp="2025-12-05 20:24:48 +0000 UTC" firstStartedPulling="2025-12-05 20:24:49.007722037 +0000 UTC m=+859.237533395" lastFinishedPulling="2025-12-05 20:24:54.344752052 +0000 UTC m=+864.574563430" observedRunningTime="2025-12-05 20:24:54.915770172 +0000 UTC m=+865.145581540" watchObservedRunningTime="2025-12-05 20:24:54.918772178 +0000 UTC m=+865.148583546" Dec 05 20:24:54 crc kubenswrapper[4744]: I1205 20:24:54.938132 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" podStartSLOduration=1.915854605 podStartE2EDuration="6.938105834s" podCreationTimestamp="2025-12-05 20:24:48 +0000 UTC" firstStartedPulling="2025-12-05 20:24:49.338862515 +0000 UTC m=+859.568673883" lastFinishedPulling="2025-12-05 20:24:54.361113724 +0000 UTC m=+864.590925112" observedRunningTime="2025-12-05 20:24:54.933412816 +0000 UTC m=+865.163224194" watchObservedRunningTime="2025-12-05 20:24:54.938105834 +0000 UTC m=+865.167917202" Dec 05 20:25:09 crc kubenswrapper[4744]: I1205 20:25:09.125896 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-686966fbfb-zrgkc" Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.416041 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-767nv"] Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.417675 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.431572 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-767nv"]
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.552238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.552330 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.552416 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.653976 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.654031 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.654061 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.654674 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.654742 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.673458 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv"
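The volume lines above follow the reconciler's two-phase pattern: VerifyControllerAttachedVolume first (reconciler_common.go:245), then MountVolume/SetUp (reconciler_common.go:218 and operation_generator.go:637). For emptyDir and projected volumes there is no real attach step, so verification completes immediately. A toy sketch of that desired-state/actual-state loop; the types are illustrative, not kubelet's:

```go
// Toy desired-state/actual-state loop mirroring the reconciler lines above.
// volState and reconcile are illustrative names, not kubelet types.
package main

import "fmt"

type volState int

const (
	unverified volState = iota
	verified
	mounted
)

func reconcile(actual map[string]volState, desired []string) {
	for _, v := range desired {
		switch actual[v] {
		case unverified:
			// phase 1: confirm the volume is attached; emptyDir and
			// projected volumes have no attach step, so this is immediate
			fmt.Println("VerifyControllerAttachedVolume started for", v)
			actual[v] = verified
		case verified:
			// phase 2: set the volume up under the pod's volumes directory
			fmt.Println("MountVolume.SetUp succeeded for", v)
			actual[v] = mounted
		}
	}
}

func main() {
	actual := map[string]volState{}
	desired := []string{"utilities", "kube-api-access-6th8h", "catalog-content"}
	reconcile(actual, desired) // verify pass
	reconcile(actual, desired) // mount pass
}
```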
"MountVolume.SetUp succeeded for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") pod \"certified-operators-767nv\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " pod="openshift-marketplace/certified-operators-767nv" Dec 05 20:25:26 crc kubenswrapper[4744]: I1205 20:25:26.735529 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-767nv" Dec 05 20:25:27 crc kubenswrapper[4744]: I1205 20:25:27.058270 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-767nv"] Dec 05 20:25:27 crc kubenswrapper[4744]: I1205 20:25:27.093117 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerStarted","Data":"18dde0b9ff5f6486955e5fd2e69a90a73e1b45f27b8af77e54a4aa706de78f99"} Dec 05 20:25:28 crc kubenswrapper[4744]: I1205 20:25:28.102085 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc188947-7bde-445e-8638-e23eaec30a29" containerID="26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297" exitCode=0 Dec 05 20:25:28 crc kubenswrapper[4744]: I1205 20:25:28.102178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerDied","Data":"26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297"} Dec 05 20:25:28 crc kubenswrapper[4744]: I1205 20:25:28.740754 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-579c6fcd5-kp2mf" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.111366 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc188947-7bde-445e-8638-e23eaec30a29" containerID="6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be" exitCode=0 Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.111414 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerDied","Data":"6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be"} Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.456908 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"] Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.457988 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.460677 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fclgd" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.465118 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8gfzk"] Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.466396 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.467592 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.471525 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.471892 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.473778 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"]
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.556488 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qq6d7"]
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.557537 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.563223 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.563483 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.563718 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rmxmm"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.563783 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.570038 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-ztkrp"]
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.575583 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.594422 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.597347 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ztkrp"]
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598039 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-sockets\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598103 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-reloader\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598131 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzwf\" (UniqueName: \"kubernetes.io/projected/2498f6fb-e1f7-481e-af17-1138c80628ae-kube-api-access-nnzwf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598153 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-conf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598235 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67mmf\" (UniqueName: \"kubernetes.io/projected/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-kube-api-access-67mmf\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.598254 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-startup\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.699954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.700543 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/934c31a1-c04b-42d9-be60-cdbf988913eb-metallb-excludel2\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.700702 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-metrics-certs\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.700837 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vv4\" (UniqueName: \"kubernetes.io/projected/934c31a1-c04b-42d9-be60-cdbf988913eb-kube-api-access-b6vv4\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.700948 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-sockets\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.701032 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.701131 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-reloader\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.701221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-metrics-certs\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.701394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzwf\" (UniqueName: \"kubernetes.io/projected/2498f6fb-e1f7-481e-af17-1138c80628ae-kube-api-access-nnzwf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
\"kubernetes.io/projected/2498f6fb-e1f7-481e-af17-1138c80628ae-kube-api-access-nnzwf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.702798 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.702091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-sockets\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.702275 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-reloader\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.701896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.702892 4744 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.703450 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert podName:52a85b81-11c7-4a3f-9a1d-9ffe9edaa447 nodeName:}" failed. No retries permitted until 2025-12-05 20:25:30.203428621 +0000 UTC m=+900.433239989 (durationBeforeRetry 500ms). 
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.703718 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-cert\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.703880 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.703983 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-conf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.704091 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67mmf\" (UniqueName: \"kubernetes.io/projected/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-kube-api-access-67mmf\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.704180 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2t6t\" (UniqueName: \"kubernetes.io/projected/a15804b0-0714-4384-ac0d-917338ef4104-kube-api-access-c2t6t\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.704269 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-startup\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.705287 4744 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.715607 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs podName:2498f6fb-e1f7-481e-af17-1138c80628ae nodeName:}" failed. No retries permitted until 2025-12-05 20:25:30.215578927 +0000 UTC m=+900.445390305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs") pod "frr-k8s-8gfzk" (UID: "2498f6fb-e1f7-481e-af17-1138c80628ae") : secret "frr-k8s-certs-secret" not found
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.705750 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-conf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.716426 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2498f6fb-e1f7-481e-af17-1138c80628ae-frr-startup\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.727157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzwf\" (UniqueName: \"kubernetes.io/projected/2498f6fb-e1f7-481e-af17-1138c80628ae-kube-api-access-nnzwf\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.730066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67mmf\" (UniqueName: \"kubernetes.io/projected/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-kube-api-access-67mmf\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.805790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/934c31a1-c04b-42d9-be60-cdbf988913eb-metallb-excludel2\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.805862 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-metrics-certs\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.805909 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vv4\" (UniqueName: \"kubernetes.io/projected/934c31a1-c04b-42d9-be60-cdbf988913eb-kube-api-access-b6vv4\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.805958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-metrics-certs\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.806016 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-cert\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.806056 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2t6t\" (UniqueName: \"kubernetes.io/projected/a15804b0-0714-4384-ac0d-917338ef4104-kube-api-access-c2t6t\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.806083 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7" Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.806229 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.806379 4744 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.806390 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist podName:934c31a1-c04b-42d9-be60-cdbf988913eb nodeName:}" failed. No retries permitted until 2025-12-05 20:25:30.306368879 +0000 UTC m=+900.536180247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist") pod "speaker-qq6d7" (UID: "934c31a1-c04b-42d9-be60-cdbf988913eb") : secret "metallb-memberlist" not found Dec 05 20:25:29 crc kubenswrapper[4744]: E1205 20:25:29.806437 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-metrics-certs podName:a15804b0-0714-4384-ac0d-917338ef4104 nodeName:}" failed. No retries permitted until 2025-12-05 20:25:30.30641895 +0000 UTC m=+900.536230398 (durationBeforeRetry 500ms). 
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.806599 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/934c31a1-c04b-42d9-be60-cdbf988913eb-metallb-excludel2\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.810708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-metrics-certs\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.813749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-cert\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.822938 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vv4\" (UniqueName: \"kubernetes.io/projected/934c31a1-c04b-42d9-be60-cdbf988913eb-kube-api-access-b6vv4\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:29 crc kubenswrapper[4744]: I1205 20:25:29.826365 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2t6t\" (UniqueName: \"kubernetes.io/projected/a15804b0-0714-4384-ac0d-917338ef4104-kube-api-access-c2t6t\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.118212 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerStarted","Data":"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280"}
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.138960 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-767nv" podStartSLOduration=2.727429496 podStartE2EDuration="4.138941383s" podCreationTimestamp="2025-12-05 20:25:26 +0000 UTC" firstStartedPulling="2025-12-05 20:25:28.104913911 +0000 UTC m=+898.334725289" lastFinishedPulling="2025-12-05 20:25:29.516425808 +0000 UTC m=+899.746237176" observedRunningTime="2025-12-05 20:25:30.136668926 +0000 UTC m=+900.366480304" watchObservedRunningTime="2025-12-05 20:25:30.138941383 +0000 UTC m=+900.368752761"
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.211976 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.215396 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52a85b81-11c7-4a3f-9a1d-9ffe9edaa447-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-84dtl\" (UID: \"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.313692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-metrics-certs\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.313787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.313846 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7" Dec 05 20:25:30 crc kubenswrapper[4744]: E1205 20:25:30.313989 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 20:25:30 crc kubenswrapper[4744]: E1205 20:25:30.314061 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist podName:934c31a1-c04b-42d9-be60-cdbf988913eb nodeName:}" failed. No retries permitted until 2025-12-05 20:25:31.314042136 +0000 UTC m=+901.543853524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist") pod "speaker-qq6d7" (UID: "934c31a1-c04b-42d9-be60-cdbf988913eb") : secret "metallb-memberlist" not found Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.318342 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2498f6fb-e1f7-481e-af17-1138c80628ae-metrics-certs\") pod \"frr-k8s-8gfzk\" (UID: \"2498f6fb-e1f7-481e-af17-1138c80628ae\") " pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.318997 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15804b0-0714-4384-ac0d-917338ef4104-metrics-certs\") pod \"controller-f8648f98b-ztkrp\" (UID: \"a15804b0-0714-4384-ac0d-917338ef4104\") " pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.378145 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fclgd" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.387649 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.396513 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.551116 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-ztkrp"
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.620722 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"]
Dec 05 20:25:30 crc kubenswrapper[4744]: W1205 20:25:30.633028 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a85b81_11c7_4a3f_9a1d_9ffe9edaa447.slice/crio-f919276f3643659d25d5b5ab95aab6f954f308108f51ab8d5ae833c04e37d0e3 WatchSource:0}: Error finding container f919276f3643659d25d5b5ab95aab6f954f308108f51ab8d5ae833c04e37d0e3: Status 404 returned error can't find the container with id f919276f3643659d25d5b5ab95aab6f954f308108f51ab8d5ae833c04e37d0e3
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.996344 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"]
Dec 05 20:25:30 crc kubenswrapper[4744]: I1205 20:25:30.998645 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gmkj"
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.007063 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"]
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.044179 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ztkrp"]
Dec 05 20:25:31 crc kubenswrapper[4744]: W1205 20:25:31.058471 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15804b0_0714_4384_ac0d_917338ef4104.slice/crio-69db575c52b767254a6c838ca5bedcb1891708840107d36f866ca553890dd41f WatchSource:0}: Error finding container 69db575c52b767254a6c838ca5bedcb1891708840107d36f866ca553890dd41f: Status 404 returned error can't find the container with id 69db575c52b767254a6c838ca5bedcb1891708840107d36f866ca553890dd41f
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.125550 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756th\" (UniqueName: \"kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj"
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.125647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj"
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.125804 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj"
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.126312 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"2ccff18f762b77e941f8d8a150c77a2bc75b3abde1baf641072fe5dc1a50dcd4"}
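The W-level manager.go:1169 lines above are a benign race: cAdvisor sees the new cgroup before CRI-O has registered the container, so the lookup 404s, and the same container IDs show up via PLEG as ContainerStarted within a second. The cgroup path itself encodes the pod UID (with underscores for dashes) and the container ID; a sketch that recovers both, where support for the guaranteed-QoS layout (no besteffort/burstable segment) is an assumption:

```go
// Recover pod UID and container ID from the cgroup paths in the
// watch-event warnings above. Hypothetical log-analysis helper.
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var cgRe = regexp.MustCompile(`kubepods-(?:(?:besteffort|burstable)-)?pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)`)

func main() {
	path := "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a85b81_11c7_4a3f_9a1d_9ffe9edaa447.slice/crio-f919276f3643659d25d5b5ab95aab6f954f308108f51ab8d5ae833c04e37d0e3"
	if m := cgRe.FindStringSubmatch(path); m != nil {
		fmt.Println("pod UID:      ", strings.ReplaceAll(m[1], "_", "-")) // the webhook-server pod's UID
		fmt.Println("container ID: ", m[2])
	}
}
```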
event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"2ccff18f762b77e941f8d8a150c77a2bc75b3abde1baf641072fe5dc1a50dcd4"} Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.127806 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ztkrp" event={"ID":"a15804b0-0714-4384-ac0d-917338ef4104","Type":"ContainerStarted","Data":"69db575c52b767254a6c838ca5bedcb1891708840107d36f866ca553890dd41f"} Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.129338 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" event={"ID":"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447","Type":"ContainerStarted","Data":"f919276f3643659d25d5b5ab95aab6f954f308108f51ab8d5ae833c04e37d0e3"} Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.227719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.227829 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756th\" (UniqueName: \"kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.227896 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.228706 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.228741 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.262083 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756th\" (UniqueName: \"kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th\") pod \"redhat-marketplace-8gmkj\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.317401 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.329586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:31 crc kubenswrapper[4744]: E1205 20:25:31.329782 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 05 20:25:31 crc kubenswrapper[4744]: E1205 20:25:31.329865 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist podName:934c31a1-c04b-42d9-be60-cdbf988913eb nodeName:}" failed. No retries permitted until 2025-12-05 20:25:33.329842511 +0000 UTC m=+903.559653879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist") pod "speaker-qq6d7" (UID: "934c31a1-c04b-42d9-be60-cdbf988913eb") : secret "metallb-memberlist" not found
Dec 05 20:25:31 crc kubenswrapper[4744]: I1205 20:25:31.606688 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"]
Dec 05 20:25:31 crc kubenswrapper[4744]: W1205 20:25:31.627934 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe4261db_7205_4cad_8113_f6d7738be191.slice/crio-ace7b27f7e34f25c5d899fddfa7b09572a1bfe2a33ad4c6ef43a26e1070ffa00 WatchSource:0}: Error finding container ace7b27f7e34f25c5d899fddfa7b09572a1bfe2a33ad4c6ef43a26e1070ffa00: Status 404 returned error can't find the container with id ace7b27f7e34f25c5d899fddfa7b09572a1bfe2a33ad4c6ef43a26e1070ffa00
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.140972 4744 generic.go:334] "Generic (PLEG): container finished" podID="be4261db-7205-4cad-8113-f6d7738be191" containerID="e719fab541fd7ea62b55578921f17da7b6fddbac9a37b3705ddfa136412d46f1" exitCode=0
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.141074 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerDied","Data":"e719fab541fd7ea62b55578921f17da7b6fddbac9a37b3705ddfa136412d46f1"}
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.141308 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerStarted","Data":"ace7b27f7e34f25c5d899fddfa7b09572a1bfe2a33ad4c6ef43a26e1070ffa00"}
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.144598 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ztkrp" event={"ID":"a15804b0-0714-4384-ac0d-917338ef4104","Type":"ContainerStarted","Data":"d6b3d7ee0bbc7edba3a009a13425d4e4a8eb1cf44774a9089a96eaa375621fe2"}
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.144630 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ztkrp" event={"ID":"a15804b0-0714-4384-ac0d-917338ef4104","Type":"ContainerStarted","Data":"8abe30a069f8cb69ed00794b88c0c15dc71a76d6ebabbf9569f984df4ca7e119"}
Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.144753 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-ztkrp"
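The memberlist mount for speaker-qq6d7 shows nestedpendingoperations' exponential backoff directly: durationBeforeRetry 500ms at 20:25:29.806390, 1s at 20:25:30.314061, 2s at 20:25:31.329865, with success following once the metallb-memberlist secret exists. A sketch of that doubling schedule; the cap is an assumption, since only the first three steps are visible in this log:

```go
// The doubling retry delay visible above (500ms → 1s → 2s).
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond
	const maxDelay = 2 * time.Minute // assumed cap, not shown in the log
	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```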
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:32 crc kubenswrapper[4744]: I1205 20:25:32.181038 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-ztkrp" podStartSLOduration=3.181005456 podStartE2EDuration="3.181005456s" podCreationTimestamp="2025-12-05 20:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:32.180649448 +0000 UTC m=+902.410460826" watchObservedRunningTime="2025-12-05 20:25:32.181005456 +0000 UTC m=+902.410816824" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.169573 4744 generic.go:334] "Generic (PLEG): container finished" podID="be4261db-7205-4cad-8113-f6d7738be191" containerID="3a64dae57ece69ad44654f81ecde095d008ce3003eada549d1668ebdff55f696" exitCode=0 Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.169645 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerDied","Data":"3a64dae57ece69ad44654f81ecde095d008ce3003eada549d1668ebdff55f696"} Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.358695 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.366038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/934c31a1-c04b-42d9-be60-cdbf988913eb-memberlist\") pod \"speaker-qq6d7\" (UID: \"934c31a1-c04b-42d9-be60-cdbf988913eb\") " pod="metallb-system/speaker-qq6d7" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.414635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.416117 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.450762 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.523961 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rmxmm" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.532400 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.561924 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.562038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.562075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7st5j\" (UniqueName: \"kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: W1205 20:25:33.586374 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934c31a1_c04b_42d9_be60_cdbf988913eb.slice/crio-a5f1b2d7296b94beea27bebf2ed27f31e3e1150e31b8079cc76c7442abae0502 WatchSource:0}: Error finding container a5f1b2d7296b94beea27bebf2ed27f31e3e1150e31b8079cc76c7442abae0502: Status 404 returned error can't find the container with id a5f1b2d7296b94beea27bebf2ed27f31e3e1150e31b8079cc76c7442abae0502
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.662817 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.662873 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7st5j\" (UniqueName: \"kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.662914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.663318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.664708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6"
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.684026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7st5j\" (UniqueName: \"kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j\") pod \"community-operators-rvfs6\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:33 crc kubenswrapper[4744]: I1205 20:25:33.761118 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.077983 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.202121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerStarted","Data":"dc3cd04b304a17bd1799fb3f56c6bbe465177096a441c7cbfb7d3305a039fe28"} Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.206966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qq6d7" event={"ID":"934c31a1-c04b-42d9-be60-cdbf988913eb","Type":"ContainerStarted","Data":"ddcd47a8dad42f1b667f37394f312c7c31227467814ed0d3738895617dfd0935"} Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.207040 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qq6d7" event={"ID":"934c31a1-c04b-42d9-be60-cdbf988913eb","Type":"ContainerStarted","Data":"a5f1b2d7296b94beea27bebf2ed27f31e3e1150e31b8079cc76c7442abae0502"} Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.208546 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerStarted","Data":"4ae098aa9217fcb5cca1d8b60315034ffdc8777a993f540d77f7b166eb2b21c3"} Dec 05 20:25:34 crc kubenswrapper[4744]: I1205 20:25:34.223923 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8gmkj" podStartSLOduration=2.619500334 podStartE2EDuration="4.223903211s" podCreationTimestamp="2025-12-05 20:25:30 +0000 UTC" firstStartedPulling="2025-12-05 20:25:32.142789086 +0000 UTC m=+902.372600454" lastFinishedPulling="2025-12-05 20:25:33.747191963 +0000 UTC m=+903.977003331" observedRunningTime="2025-12-05 20:25:34.220299011 +0000 UTC m=+904.450110389" watchObservedRunningTime="2025-12-05 20:25:34.223903211 +0000 UTC m=+904.453714579" Dec 05 20:25:35 crc kubenswrapper[4744]: I1205 20:25:35.223058 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerStarted","Data":"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37"} Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.229727 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qq6d7" event={"ID":"934c31a1-c04b-42d9-be60-cdbf988913eb","Type":"ContainerStarted","Data":"f83af317f33fa42c25b1dd6973fe7ab6122dbb840a3def0a35613b575723f89c"} Dec 05 20:25:36 crc 
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.230660 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qq6d7"
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.232166 4744 generic.go:334] "Generic (PLEG): container finished" podID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerID="9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37" exitCode=0
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.232193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerDied","Data":"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37"}
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.262095 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qq6d7" podStartSLOduration=7.262065197 podStartE2EDuration="7.262065197s" podCreationTimestamp="2025-12-05 20:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:25:36.246212389 +0000 UTC m=+906.476023757" watchObservedRunningTime="2025-12-05 20:25:36.262065197 +0000 UTC m=+906.491876555"
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.737557 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.737738 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:36 crc kubenswrapper[4744]: I1205 20:25:36.782477 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:37 crc kubenswrapper[4744]: I1205 20:25:37.276887 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-767nv"
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.184373 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-767nv"]
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.250626 4744 generic.go:334] "Generic (PLEG): container finished" podID="2498f6fb-e1f7-481e-af17-1138c80628ae" containerID="e673726559f637303001248a61ec761cf560f32db04b34482554f6dc81b3da55" exitCode=0
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.250689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerDied","Data":"e673726559f637303001248a61ec761cf560f32db04b34482554f6dc81b3da55"}
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.252903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" event={"ID":"52a85b81-11c7-4a3f-9a1d-9ffe9edaa447","Type":"ContainerStarted","Data":"be814545228e1c9b70e713c3ce5282f412eda5f6bbf33bc67910a5b6c4944f18"}
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.253057 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl"
Dec 05 20:25:39 crc kubenswrapper[4744]: I1205 20:25:39.299897 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" podStartSLOduration=2.058377122 podStartE2EDuration="10.299875955s" podCreationTimestamp="2025-12-05 20:25:29 +0000 UTC" firstStartedPulling="2025-12-05 20:25:30.635001298 +0000 UTC m=+900.864812666" lastFinishedPulling="2025-12-05 20:25:38.876500131 +0000 UTC m=+909.106311499" observedRunningTime="2025-12-05 20:25:39.29518648 +0000 UTC m=+909.524997858" watchObservedRunningTime="2025-12-05 20:25:39.299875955 +0000 UTC m=+909.529687323"
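certified-operators-767nv above walks through the startup-probe gate: readiness is observed but not acted on while the startup probe reports "unhealthy", the startup probe flips to "started" at 20:25:36.782477, and readiness reports "ready" at 20:25:37.276887 (an empty status="" marks the initial unknown state). A toy model of that gating, not kubelet's prober:

```go
// Readiness results only count once the startup probe has succeeded.
package main

import "fmt"

type podProbes struct {
	started bool // startup probe has succeeded
	ready   bool
}

func (p *podProbes) observe(probe, result string) {
	switch probe {
	case "startup":
		p.started = result == "started"
	case "readiness":
		if p.started { // gate: ignore readiness until started
			p.ready = result == "ready"
		}
	}
	fmt.Printf("probe=%s status=%q -> started=%v ready=%v\n", probe, result, p.started, p.ready)
}

func main() {
	p := &podProbes{}
	p.observe("readiness", "") // initial unknown, ignored
	p.observe("startup", "unhealthy")
	p.observe("startup", "started")
	p.observe("readiness", "ready")
}
```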
podStartE2EDuration="10.299875955s" podCreationTimestamp="2025-12-05 20:25:29 +0000 UTC" firstStartedPulling="2025-12-05 20:25:30.635001298 +0000 UTC m=+900.864812666" lastFinishedPulling="2025-12-05 20:25:38.876500131 +0000 UTC m=+909.106311499" observedRunningTime="2025-12-05 20:25:39.29518648 +0000 UTC m=+909.524997858" watchObservedRunningTime="2025-12-05 20:25:39.299875955 +0000 UTC m=+909.529687323" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.259590 4744 generic.go:334] "Generic (PLEG): container finished" podID="2498f6fb-e1f7-481e-af17-1138c80628ae" containerID="96dcd2ae4fbba0cedb2cb60315f1ae869378e2f6ee290f28e9ccaffb7fe8644f" exitCode=0 Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.259654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerDied","Data":"96dcd2ae4fbba0cedb2cb60315f1ae869378e2f6ee290f28e9ccaffb7fe8644f"} Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.261866 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerStarted","Data":"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa"} Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.261984 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-767nv" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="registry-server" containerID="cri-o://ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280" gracePeriod=2 Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.628958 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-767nv" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.771712 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities\") pod \"bc188947-7bde-445e-8638-e23eaec30a29\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.771783 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") pod \"bc188947-7bde-445e-8638-e23eaec30a29\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.771852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content\") pod \"bc188947-7bde-445e-8638-e23eaec30a29\" (UID: \"bc188947-7bde-445e-8638-e23eaec30a29\") " Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.772583 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities" (OuterVolumeSpecName: "utilities") pod "bc188947-7bde-445e-8638-e23eaec30a29" (UID: "bc188947-7bde-445e-8638-e23eaec30a29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.777926 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h" (OuterVolumeSpecName: "kube-api-access-6th8h") pod "bc188947-7bde-445e-8638-e23eaec30a29" (UID: "bc188947-7bde-445e-8638-e23eaec30a29"). InnerVolumeSpecName "kube-api-access-6th8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.820486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc188947-7bde-445e-8638-e23eaec30a29" (UID: "bc188947-7bde-445e-8638-e23eaec30a29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.873693 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.873765 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6th8h\" (UniqueName: \"kubernetes.io/projected/bc188947-7bde-445e-8638-e23eaec30a29-kube-api-access-6th8h\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:40 crc kubenswrapper[4744]: I1205 20:25:40.873780 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc188947-7bde-445e-8638-e23eaec30a29-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.267988 4744 generic.go:334] "Generic (PLEG): container finished" podID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerID="c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa" exitCode=0 Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.268051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerDied","Data":"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa"} Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.271989 4744 generic.go:334] "Generic (PLEG): container finished" podID="2498f6fb-e1f7-481e-af17-1138c80628ae" containerID="bdb61312080da1ee5004d75d41a703b6669a33e7ec813d92053221c8ac6a58af" exitCode=0 Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.272085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerDied","Data":"bdb61312080da1ee5004d75d41a703b6669a33e7ec813d92053221c8ac6a58af"} Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.275524 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc188947-7bde-445e-8638-e23eaec30a29" containerID="ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280" exitCode=0 Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.275571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerDied","Data":"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280"} Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.275590 4744 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-767nv" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.275612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-767nv" event={"ID":"bc188947-7bde-445e-8638-e23eaec30a29","Type":"ContainerDied","Data":"18dde0b9ff5f6486955e5fd2e69a90a73e1b45f27b8af77e54a4aa706de78f99"} Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.275637 4744 scope.go:117] "RemoveContainer" containerID="ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.295634 4744 scope.go:117] "RemoveContainer" containerID="6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.320219 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.320520 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.339816 4744 scope.go:117] "RemoveContainer" containerID="26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.342203 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-767nv"] Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.345491 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-767nv"] Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.388232 4744 scope.go:117] "RemoveContainer" containerID="ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280" Dec 05 20:25:41 crc kubenswrapper[4744]: E1205 20:25:41.388664 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280\": container with ID starting with ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280 not found: ID does not exist" containerID="ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.388699 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280"} err="failed to get container status \"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280\": rpc error: code = NotFound desc = could not find container \"ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280\": container with ID starting with ddb771c5a143d479ebc8e7479e2ad92e659429a843cc5ef2e0ac557a74ca7280 not found: ID does not exist" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.388724 4744 scope.go:117] "RemoveContainer" containerID="6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be" Dec 05 20:25:41 crc kubenswrapper[4744]: E1205 20:25:41.389008 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be\": container with ID starting with 6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be not found: ID does not exist" 
containerID="6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.389043 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be"} err="failed to get container status \"6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be\": rpc error: code = NotFound desc = could not find container \"6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be\": container with ID starting with 6cc340403bbefcb6f00372b798b9fffd870513f10b9ba128952b0e5af0c1b8be not found: ID does not exist" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.389071 4744 scope.go:117] "RemoveContainer" containerID="26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297" Dec 05 20:25:41 crc kubenswrapper[4744]: E1205 20:25:41.389317 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297\": container with ID starting with 26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297 not found: ID does not exist" containerID="26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.389338 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297"} err="failed to get container status \"26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297\": rpc error: code = NotFound desc = could not find container \"26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297\": container with ID starting with 26ca2b8ca676b2224d338fcbe397e40b2291952b50a67f3ee7f459aa1b16e297 not found: ID does not exist" Dec 05 20:25:41 crc kubenswrapper[4744]: I1205 20:25:41.410729 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.087636 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc188947-7bde-445e-8638-e23eaec30a29" path="/var/lib/kubelet/pods/bc188947-7bde-445e-8638-e23eaec30a29/volumes" Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.283310 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerStarted","Data":"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288396 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"155c109b990d2fb37dd6600d4a8bbd9cf50197f061992b2c3cd84ec1c434f005"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288473 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"3f862fe74534e2104713038f729d06f38ceeb79735c58d700eddd39aadeeefef"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288502 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" 
event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"edbef3e6c62cacc963d59dcd31ee8383c3f08abd472b1c4b5530c34cde36fe85"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288539 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"5e1b5a23bacf649bf5e2580625ef8f4c3006c23478e0278c029cda90b1e0cdc7"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"3087a1232ef0acd4a4ada51e26b08c60980a728f7d70eb7085865ce9b9c7c926"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.288606 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8gfzk" event={"ID":"2498f6fb-e1f7-481e-af17-1138c80628ae","Type":"ContainerStarted","Data":"3a3cb113371489e69f6c927a85461e2e0d73316df667ecb73a5a46695382ea1a"} Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.311483 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvfs6" podStartSLOduration=6.399316727 podStartE2EDuration="9.311470251s" podCreationTimestamp="2025-12-05 20:25:33 +0000 UTC" firstStartedPulling="2025-12-05 20:25:38.757756691 +0000 UTC m=+908.987568089" lastFinishedPulling="2025-12-05 20:25:41.669910245 +0000 UTC m=+911.899721613" observedRunningTime="2025-12-05 20:25:42.311020151 +0000 UTC m=+912.540831519" watchObservedRunningTime="2025-12-05 20:25:42.311470251 +0000 UTC m=+912.541281619" Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.341318 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8gfzk" podStartSLOduration=5.019577652 podStartE2EDuration="13.341278044s" podCreationTimestamp="2025-12-05 20:25:29 +0000 UTC" firstStartedPulling="2025-12-05 20:25:30.584858607 +0000 UTC m=+900.814669975" lastFinishedPulling="2025-12-05 20:25:38.906558999 +0000 UTC m=+909.136370367" observedRunningTime="2025-12-05 20:25:42.332373106 +0000 UTC m=+912.562184474" watchObservedRunningTime="2025-12-05 20:25:42.341278044 +0000 UTC m=+912.571089432" Dec 05 20:25:42 crc kubenswrapper[4744]: I1205 20:25:42.354642 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:43 crc kubenswrapper[4744]: I1205 20:25:43.761857 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:43 crc kubenswrapper[4744]: I1205 20:25:43.761927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:43 crc kubenswrapper[4744]: I1205 20:25:43.809960 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:44 crc kubenswrapper[4744]: I1205 20:25:44.581822 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"] Dec 05 20:25:45 crc kubenswrapper[4744]: I1205 20:25:45.307383 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-8gmkj" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="registry-server" containerID="cri-o://dc3cd04b304a17bd1799fb3f56c6bbe465177096a441c7cbfb7d3305a039fe28" gracePeriod=2 Dec 05 20:25:45 crc kubenswrapper[4744]: I1205 20:25:45.397152 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:45 crc kubenswrapper[4744]: I1205 20:25:45.444701 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.316804 4744 generic.go:334] "Generic (PLEG): container finished" podID="be4261db-7205-4cad-8113-f6d7738be191" containerID="dc3cd04b304a17bd1799fb3f56c6bbe465177096a441c7cbfb7d3305a039fe28" exitCode=0 Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.316880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerDied","Data":"dc3cd04b304a17bd1799fb3f56c6bbe465177096a441c7cbfb7d3305a039fe28"} Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.397652 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.575765 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756th\" (UniqueName: \"kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th\") pod \"be4261db-7205-4cad-8113-f6d7738be191\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.575838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content\") pod \"be4261db-7205-4cad-8113-f6d7738be191\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.575913 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities\") pod \"be4261db-7205-4cad-8113-f6d7738be191\" (UID: \"be4261db-7205-4cad-8113-f6d7738be191\") " Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.576995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities" (OuterVolumeSpecName: "utilities") pod "be4261db-7205-4cad-8113-f6d7738be191" (UID: "be4261db-7205-4cad-8113-f6d7738be191"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.581388 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th" (OuterVolumeSpecName: "kube-api-access-756th") pod "be4261db-7205-4cad-8113-f6d7738be191" (UID: "be4261db-7205-4cad-8113-f6d7738be191"). InnerVolumeSpecName "kube-api-access-756th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.594134 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be4261db-7205-4cad-8113-f6d7738be191" (UID: "be4261db-7205-4cad-8113-f6d7738be191"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.677868 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.677921 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756th\" (UniqueName: \"kubernetes.io/projected/be4261db-7205-4cad-8113-f6d7738be191-kube-api-access-756th\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:46 crc kubenswrapper[4744]: I1205 20:25:46.677939 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4261db-7205-4cad-8113-f6d7738be191-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.329650 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gmkj" event={"ID":"be4261db-7205-4cad-8113-f6d7738be191","Type":"ContainerDied","Data":"ace7b27f7e34f25c5d899fddfa7b09572a1bfe2a33ad4c6ef43a26e1070ffa00"} Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.329751 4744 scope.go:117] "RemoveContainer" containerID="dc3cd04b304a17bd1799fb3f56c6bbe465177096a441c7cbfb7d3305a039fe28" Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.329801 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gmkj" Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.350929 4744 scope.go:117] "RemoveContainer" containerID="3a64dae57ece69ad44654f81ecde095d008ce3003eada549d1668ebdff55f696" Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.373603 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"] Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.381893 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gmkj"] Dec 05 20:25:47 crc kubenswrapper[4744]: I1205 20:25:47.392445 4744 scope.go:117] "RemoveContainer" containerID="e719fab541fd7ea62b55578921f17da7b6fddbac9a37b3705ddfa136412d46f1" Dec 05 20:25:48 crc kubenswrapper[4744]: I1205 20:25:48.090355 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4261db-7205-4cad-8113-f6d7738be191" path="/var/lib/kubelet/pods/be4261db-7205-4cad-8113-f6d7738be191/volumes" Dec 05 20:25:50 crc kubenswrapper[4744]: I1205 20:25:50.391191 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-84dtl" Dec 05 20:25:50 crc kubenswrapper[4744]: I1205 20:25:50.556654 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-ztkrp" Dec 05 20:25:53 crc kubenswrapper[4744]: I1205 20:25:53.537152 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qq6d7" Dec 05 20:25:54 crc kubenswrapper[4744]: I1205 20:25:54.001219 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:54 crc kubenswrapper[4744]: I1205 20:25:54.048553 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:54 crc kubenswrapper[4744]: I1205 20:25:54.380438 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rvfs6" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="registry-server" containerID="cri-o://29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44" gracePeriod=2 Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.315729 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319075 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd"] Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319371 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319392 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319407 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319415 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319429 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319436 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319447 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319454 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319467 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319476 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319495 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319503 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319512 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319519 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319531 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319538 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="extract-content" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.319552 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319560 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="extract-utilities" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319691 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc188947-7bde-445e-8638-e23eaec30a29" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319705 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.319727 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4261db-7205-4cad-8113-f6d7738be191" containerName="registry-server" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.320727 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.333040 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.355390 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd"] Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.391577 4744 generic.go:334] "Generic (PLEG): container finished" podID="09778a24-7f9d-4e5f-8114-398ec482ccbd" containerID="29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44" exitCode=0 Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.391626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerDied","Data":"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44"}
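The cpu_manager.go / memory_manager.go / state_mem.go burst above is RemoveStaleState: before admitting the new bundle pod, the resource managers sweep checkpointed per-container assignments whose pods (the three just-deleted registry pods) no longer exist. A toy version of that sweep over a map-backed state store; the types here are illustrative, not the kubelet's real state package:

```go
// Drop checkpointed CPU assignments for containers whose pods are gone.
package main

import "fmt"

type containerKey struct{ podUID, name string }

func removeStaleState(activePods map[string]bool, cpuSets map[containerKey][]int) {
	for key := range cpuSets {
		if activePods[key.podUID] {
			continue // pod still exists; assignment is not stale
		}
		fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n", key.podUID, key.name)
		delete(cpuSets, key) // the log's "Deleted CPUSet assignment"
	}
}

func main() {
	cpuSets := map[containerKey][]int{
		{"09778a24-7f9d-4e5f-8114-398ec482ccbd", "registry-server"}: {2, 3},
	}
	removeStaleState(map[string]bool{}, cpuSets) // none of the old pods remain
}
```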
Need to start a new one" pod="openshift-marketplace/community-operators-rvfs6" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.391657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvfs6" event={"ID":"09778a24-7f9d-4e5f-8114-398ec482ccbd","Type":"ContainerDied","Data":"4ae098aa9217fcb5cca1d8b60315034ffdc8777a993f540d77f7b166eb2b21c3"} Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.391685 4744 scope.go:117] "RemoveContainer" containerID="29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.408827 4744 scope.go:117] "RemoveContainer" containerID="c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.425874 4744 scope.go:117] "RemoveContainer" containerID="9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.442571 4744 scope.go:117] "RemoveContainer" containerID="29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.442963 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44\": container with ID starting with 29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44 not found: ID does not exist" containerID="29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.442993 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44"} err="failed to get container status \"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44\": rpc error: code = NotFound desc = could not find container \"29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44\": container with ID starting with 29ac81aa3b3f1c19561bb1bd81656ab13bc9fcfa6850994d8f4297c5c59bef44 not found: ID does not exist" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.443013 4744 scope.go:117] "RemoveContainer" containerID="c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.443383 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa\": container with ID starting with c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa not found: ID does not exist" containerID="c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.443422 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa"} err="failed to get container status \"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa\": rpc error: code = NotFound desc = could not find container \"c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa\": container with ID starting with c1dde70858f7891ac64aa97ade3f9fde1294884db85d49f807f53b7526ddecaa not found: ID does not exist" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.443449 4744 scope.go:117] "RemoveContainer" 
containerID="9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37" Dec 05 20:25:55 crc kubenswrapper[4744]: E1205 20:25:55.443714 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37\": container with ID starting with 9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37 not found: ID does not exist" containerID="9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.443736 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37"} err="failed to get container status \"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37\": rpc error: code = NotFound desc = could not find container \"9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37\": container with ID starting with 9b8bf0c93511742190b53e3231340ebb1eedfdd1969fed9d78e88753b4148b37 not found: ID does not exist" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495315 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content\") pod \"09778a24-7f9d-4e5f-8114-398ec482ccbd\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495389 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") pod \"09778a24-7f9d-4e5f-8114-398ec482ccbd\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495436 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7st5j\" (UniqueName: \"kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j\") pod \"09778a24-7f9d-4e5f-8114-398ec482ccbd\" (UID: \"09778a24-7f9d-4e5f-8114-398ec482ccbd\") " Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495622 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495682 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.495732 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpbl9\" (UniqueName: \"kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: 
\"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.496162 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities" (OuterVolumeSpecName: "utilities") pod "09778a24-7f9d-4e5f-8114-398ec482ccbd" (UID: "09778a24-7f9d-4e5f-8114-398ec482ccbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.501366 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j" (OuterVolumeSpecName: "kube-api-access-7st5j") pod "09778a24-7f9d-4e5f-8114-398ec482ccbd" (UID: "09778a24-7f9d-4e5f-8114-398ec482ccbd"). InnerVolumeSpecName "kube-api-access-7st5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.547451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09778a24-7f9d-4e5f-8114-398ec482ccbd" (UID: "09778a24-7f9d-4e5f-8114-398ec482ccbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpbl9\" (UniqueName: \"kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597511 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597556 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597569 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09778a24-7f9d-4e5f-8114-398ec482ccbd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597578 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7st5j\" (UniqueName: 
\"kubernetes.io/projected/09778a24-7f9d-4e5f-8114-398ec482ccbd-kube-api-access-7st5j\") on node \"crc\" DevicePath \"\"" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.597964 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.598044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.613583 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpbl9\" (UniqueName: \"kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.651697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.729983 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:55 crc kubenswrapper[4744]: I1205 20:25:55.736003 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rvfs6"] Dec 05 20:25:56 crc kubenswrapper[4744]: I1205 20:25:56.078999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd"] Dec 05 20:25:56 crc kubenswrapper[4744]: I1205 20:25:56.089398 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09778a24-7f9d-4e5f-8114-398ec482ccbd" path="/var/lib/kubelet/pods/09778a24-7f9d-4e5f-8114-398ec482ccbd/volumes" Dec 05 20:25:56 crc kubenswrapper[4744]: I1205 20:25:56.403638 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" event={"ID":"c822e5a4-a983-475b-95f4-0557534a89b6","Type":"ContainerStarted","Data":"8c35dd073f1a8cdef813e6c043d87bd6c2978e5b9658fd4aad2093859c958ba5"} Dec 05 20:25:57 crc kubenswrapper[4744]: I1205 20:25:57.411341 4744 generic.go:334] "Generic (PLEG): container finished" podID="c822e5a4-a983-475b-95f4-0557534a89b6" containerID="2facc822e071e38a83d8cde0da2790adbf8c533f7bc8bfd93aeb8fe9ce3468bd" exitCode=0 Dec 05 20:25:57 crc kubenswrapper[4744]: I1205 20:25:57.411694 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" event={"ID":"c822e5a4-a983-475b-95f4-0557534a89b6","Type":"ContainerDied","Data":"2facc822e071e38a83d8cde0da2790adbf8c533f7bc8bfd93aeb8fe9ce3468bd"} Dec 05 20:26:00 crc kubenswrapper[4744]: I1205 
Dec 05 20:26:00 crc kubenswrapper[4744]: I1205 20:26:00.402616 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8gfzk" Dec 05 20:26:01 crc kubenswrapper[4744]: I1205 20:26:01.441055 4744 generic.go:334] "Generic (PLEG): container finished" podID="c822e5a4-a983-475b-95f4-0557534a89b6" containerID="30a7e8ee2ff70c2fed51e0e16b82d68df4f5c0de4a1ea3cbe408a297c54642c7" exitCode=0 Dec 05 20:26:01 crc kubenswrapper[4744]: I1205 20:26:01.441098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" event={"ID":"c822e5a4-a983-475b-95f4-0557534a89b6","Type":"ContainerDied","Data":"30a7e8ee2ff70c2fed51e0e16b82d68df4f5c0de4a1ea3cbe408a297c54642c7"} Dec 05 20:26:02 crc kubenswrapper[4744]: I1205 20:26:02.447819 4744 generic.go:334] "Generic (PLEG): container finished" podID="c822e5a4-a983-475b-95f4-0557534a89b6" containerID="41167eb65e6bc486db76b1c3be5dc3ab8b1599b8e1fa938d95d993d6ecb54272" exitCode=0 Dec 05 20:26:02 crc kubenswrapper[4744]: I1205 20:26:02.447903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" event={"ID":"c822e5a4-a983-475b-95f4-0557534a89b6","Type":"ContainerDied","Data":"41167eb65e6bc486db76b1c3be5dc3ab8b1599b8e1fa938d95d993d6ecb54272"} Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.693455 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.822668 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpbl9\" (UniqueName: \"kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9\") pod \"c822e5a4-a983-475b-95f4-0557534a89b6\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.822749 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle\") pod \"c822e5a4-a983-475b-95f4-0557534a89b6\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.822814 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util\") pod \"c822e5a4-a983-475b-95f4-0557534a89b6\" (UID: \"c822e5a4-a983-475b-95f4-0557534a89b6\") " Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.824116 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle" (OuterVolumeSpecName: "bundle") pod "c822e5a4-a983-475b-95f4-0557534a89b6" (UID: "c822e5a4-a983-475b-95f4-0557534a89b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.827553 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9" (OuterVolumeSpecName: "kube-api-access-xpbl9") pod "c822e5a4-a983-475b-95f4-0557534a89b6" (UID: "c822e5a4-a983-475b-95f4-0557534a89b6"). InnerVolumeSpecName "kube-api-access-xpbl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.840096 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util" (OuterVolumeSpecName: "util") pod "c822e5a4-a983-475b-95f4-0557534a89b6" (UID: "c822e5a4-a983-475b-95f4-0557534a89b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.924203 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.924243 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c822e5a4-a983-475b-95f4-0557534a89b6-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:03 crc kubenswrapper[4744]: I1205 20:26:03.924260 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpbl9\" (UniqueName: \"kubernetes.io/projected/c822e5a4-a983-475b-95f4-0557534a89b6-kube-api-access-xpbl9\") on node \"crc\" DevicePath \"\"" Dec 05 20:26:04 crc kubenswrapper[4744]: I1205 20:26:04.461299 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" event={"ID":"c822e5a4-a983-475b-95f4-0557534a89b6","Type":"ContainerDied","Data":"8c35dd073f1a8cdef813e6c043d87bd6c2978e5b9658fd4aad2093859c958ba5"} Dec 05 20:26:04 crc kubenswrapper[4744]: I1205 20:26:04.461343 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c35dd073f1a8cdef813e6c043d87bd6c2978e5b9658fd4aad2093859c958ba5" Dec 05 20:26:04 crc kubenswrapper[4744]: I1205 20:26:04.461371 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.538900 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm"] Dec 05 20:26:08 crc kubenswrapper[4744]: E1205 20:26:08.539723 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="util" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.539739 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="util" Dec 05 20:26:08 crc kubenswrapper[4744]: E1205 20:26:08.539757 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="pull" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.539764 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="pull" Dec 05 20:26:08 crc kubenswrapper[4744]: E1205 20:26:08.539787 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="extract" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.539795 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="extract" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.539913 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c822e5a4-a983-475b-95f4-0557534a89b6" containerName="extract" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.540466 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.542341 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-kn68x" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.542409 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.542548 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.551045 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm"] Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.584365 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57tz\" (UniqueName: \"kubernetes.io/projected/4d1b8c89-5822-4e43-9378-57baa041e444-kube-api-access-z57tz\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.584462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1b8c89-5822-4e43-9378-57baa041e444-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.685694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57tz\" (UniqueName: \"kubernetes.io/projected/4d1b8c89-5822-4e43-9378-57baa041e444-kube-api-access-z57tz\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.685773 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1b8c89-5822-4e43-9378-57baa041e444-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.686351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1b8c89-5822-4e43-9378-57baa041e444-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.724448 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57tz\" (UniqueName: \"kubernetes.io/projected/4d1b8c89-5822-4e43-9378-57baa041e444-kube-api-access-z57tz\") pod \"cert-manager-operator-controller-manager-64cf6dff88-jhzvm\" (UID: \"4d1b8c89-5822-4e43-9378-57baa041e444\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:08 crc kubenswrapper[4744]: I1205 20:26:08.857582 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" Dec 05 20:26:09 crc kubenswrapper[4744]: I1205 20:26:09.294579 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm"] Dec 05 20:26:09 crc kubenswrapper[4744]: I1205 20:26:09.490454 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" event={"ID":"4d1b8c89-5822-4e43-9378-57baa041e444","Type":"ContainerStarted","Data":"d8adff0b2bd3d137d2eea9d0f56308ccc25affbcbec8da9994ed2d761f9bd11f"} Dec 05 20:26:14 crc kubenswrapper[4744]: I1205 20:26:14.541852 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" event={"ID":"4d1b8c89-5822-4e43-9378-57baa041e444","Type":"ContainerStarted","Data":"d9dcb5d36af421283b61a3058e17886abc794c481dd68a2d3249a7e1ed62fc28"} Dec 05 20:26:14 crc kubenswrapper[4744]: I1205 20:26:14.566803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-jhzvm" podStartSLOduration=2.48724214 podStartE2EDuration="6.566786752s" podCreationTimestamp="2025-12-05 20:26:08 +0000 UTC" firstStartedPulling="2025-12-05 20:26:09.305925172 +0000 UTC m=+939.535736540" lastFinishedPulling="2025-12-05 20:26:13.385469784 +0000 UTC m=+943.615281152" observedRunningTime="2025-12-05 20:26:14.562245862 +0000 UTC m=+944.792057230" watchObservedRunningTime="2025-12-05 20:26:14.566786752 +0000 UTC m=+944.796598130" Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.597469 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"] Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.599063 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.601492 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.601584 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.602165 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-blzvg"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.610847 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"]
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.655200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.655319 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nlgl\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-kube-api-access-9nlgl\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.756978 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.757378 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nlgl\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-kube-api-access-9nlgl\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.780322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.780920 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nlgl\" (UniqueName: \"kubernetes.io/projected/e86d2f0d-d888-42bd-9781-46aecfcf2a65-kube-api-access-9nlgl\") pod \"cert-manager-webhook-f4fb5df64-qxrzz\" (UID: \"e86d2f0d-d888-42bd-9781-46aecfcf2a65\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:19 crc kubenswrapper[4744]: I1205 20:26:19.921223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.329045 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"]
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.330561 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.336952 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"]
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.337272 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kw7sl"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.367999 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm522\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-kube-api-access-wm522\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.368223 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.382693 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-qxrzz"]
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.469114 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm522\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-kube-api-access-wm522\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.469210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.484845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm522\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-kube-api-access-wm522\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.485825 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69280567-04f6-4557-8203-d729b6ec814e-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fqrqd\" (UID: \"69280567-04f6-4557-8203-d729b6ec814e\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"
pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd" Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.578176 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz" event={"ID":"e86d2f0d-d888-42bd-9781-46aecfcf2a65","Type":"ContainerStarted","Data":"60afea4b254b792c1a4aa33d342b1da5b95d4cd2c50406103360ecfddf386f31"} Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.657641 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd" Dec 05 20:26:20 crc kubenswrapper[4744]: I1205 20:26:20.946766 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd"] Dec 05 20:26:21 crc kubenswrapper[4744]: I1205 20:26:21.584797 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd" event={"ID":"69280567-04f6-4557-8203-d729b6ec814e","Type":"ContainerStarted","Data":"db56cfe8510bacb625099d1805a980d2a07455c8857133799b63ff953937695c"} Dec 05 20:26:29 crc kubenswrapper[4744]: I1205 20:26:29.646036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd" event={"ID":"69280567-04f6-4557-8203-d729b6ec814e","Type":"ContainerStarted","Data":"6007523acc10c9021c29825b82a6d5a23b75b93caa24b6368fd016922264d548"} Dec 05 20:26:29 crc kubenswrapper[4744]: I1205 20:26:29.648999 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz" event={"ID":"e86d2f0d-d888-42bd-9781-46aecfcf2a65","Type":"ContainerStarted","Data":"844d344a34ea69f735aff1ecab104d8fe186b134bc12d7ae119ae53f29e4bd15"} Dec 05 20:26:29 crc kubenswrapper[4744]: I1205 20:26:29.649183 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz" Dec 05 20:26:29 crc kubenswrapper[4744]: I1205 20:26:29.690723 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fqrqd" podStartSLOduration=1.475881094 podStartE2EDuration="9.690703589s" podCreationTimestamp="2025-12-05 20:26:20 +0000 UTC" firstStartedPulling="2025-12-05 20:26:20.969400681 +0000 UTC m=+951.199212049" lastFinishedPulling="2025-12-05 20:26:29.184223176 +0000 UTC m=+959.414034544" observedRunningTime="2025-12-05 20:26:29.662667645 +0000 UTC m=+959.892479023" watchObservedRunningTime="2025-12-05 20:26:29.690703589 +0000 UTC m=+959.920514957" Dec 05 20:26:29 crc kubenswrapper[4744]: I1205 20:26:29.692065 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz" podStartSLOduration=1.906926122 podStartE2EDuration="10.692055409s" podCreationTimestamp="2025-12-05 20:26:19 +0000 UTC" firstStartedPulling="2025-12-05 20:26:20.394863345 +0000 UTC m=+950.624674713" lastFinishedPulling="2025-12-05 20:26:29.179992622 +0000 UTC m=+959.409804000" observedRunningTime="2025-12-05 20:26:29.688599202 +0000 UTC m=+959.918410560" watchObservedRunningTime="2025-12-05 20:26:29.692055409 +0000 UTC m=+959.921866787" Dec 05 20:26:34 crc kubenswrapper[4744]: I1205 20:26:34.923980 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-qxrzz" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.178010 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-86cb77c54b-t8449"] Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.179576 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.183514 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-m4gqt" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.192828 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-t8449"] Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.291760 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-bound-sa-token\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.291883 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqnx9\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-kube-api-access-dqnx9\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.393528 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-bound-sa-token\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.393613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqnx9\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-kube-api-access-dqnx9\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.411761 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-bound-sa-token\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.418083 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqnx9\" (UniqueName: \"kubernetes.io/projected/5275d248-c1ed-4aab-a01f-1e7c65cfc66a-kube-api-access-dqnx9\") pod \"cert-manager-86cb77c54b-t8449\" (UID: \"5275d248-c1ed-4aab-a01f-1e7c65cfc66a\") " pod="cert-manager/cert-manager-86cb77c54b-t8449" Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.511458 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:26:36 crc kubenswrapper[4744]: I1205 20:26:36.916998 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-t8449"]
Dec 05 20:26:37 crc kubenswrapper[4744]: I1205 20:26:37.704608 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-t8449" event={"ID":"5275d248-c1ed-4aab-a01f-1e7c65cfc66a","Type":"ContainerStarted","Data":"d4de4d8cca7c04a28896763c0662466eef6049476284c8bf40460c1f80fc4c18"}
Dec 05 20:26:37 crc kubenswrapper[4744]: I1205 20:26:37.705086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-t8449" event={"ID":"5275d248-c1ed-4aab-a01f-1e7c65cfc66a","Type":"ContainerStarted","Data":"ad9529b8eae766d7597af8496636af3bde0a715f11dd866ea83f828611be76d1"}
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.712951 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-t8449" podStartSLOduration=12.712930598 podStartE2EDuration="12.712930598s" podCreationTimestamp="2025-12-05 20:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:26:37.727861463 +0000 UTC m=+967.957672881" watchObservedRunningTime="2025-12-05 20:26:48.712930598 +0000 UTC m=+978.942741976"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.717354 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.718370 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.720635 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.720842 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.721150 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bwx8k"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.732237 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.744362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9skd\" (UniqueName: \"kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd\") pod \"openstack-operator-index-7zdwq\" (UID: \"d4f20252-c046-47df-b4c3-a9438fa7068b\") " pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.845935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9skd\" (UniqueName: \"kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd\") pod \"openstack-operator-index-7zdwq\" (UID: \"d4f20252-c046-47df-b4c3-a9438fa7068b\") " pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:48 crc kubenswrapper[4744]: I1205 20:26:48.868624 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9skd\" (UniqueName: \"kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd\") pod \"openstack-operator-index-7zdwq\" (UID: \"d4f20252-c046-47df-b4c3-a9438fa7068b\") " pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:49 crc kubenswrapper[4744]: I1205 20:26:49.049669 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:49 crc kubenswrapper[4744]: I1205 20:26:49.517467 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:49 crc kubenswrapper[4744]: I1205 20:26:49.811272 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7zdwq" event={"ID":"d4f20252-c046-47df-b4c3-a9438fa7068b","Type":"ContainerStarted","Data":"f465e2436963af246bd248de620a20e4bf0322c99693a8391f8602ae86807aa1"}
Dec 05 20:26:51 crc kubenswrapper[4744]: I1205 20:26:51.498484 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.117762 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nxws4"]
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.119515 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.126745 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nxws4"]
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.211267 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmq5d\" (UniqueName: \"kubernetes.io/projected/9b8a4464-7411-4fb6-9078-c922921d4a65-kube-api-access-vmq5d\") pod \"openstack-operator-index-nxws4\" (UID: \"9b8a4464-7411-4fb6-9078-c922921d4a65\") " pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.313373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmq5d\" (UniqueName: \"kubernetes.io/projected/9b8a4464-7411-4fb6-9078-c922921d4a65-kube-api-access-vmq5d\") pod \"openstack-operator-index-nxws4\" (UID: \"9b8a4464-7411-4fb6-9078-c922921d4a65\") " pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.339696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmq5d\" (UniqueName: \"kubernetes.io/projected/9b8a4464-7411-4fb6-9078-c922921d4a65-kube-api-access-vmq5d\") pod \"openstack-operator-index-nxws4\" (UID: \"9b8a4464-7411-4fb6-9078-c922921d4a65\") " pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.448826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.829401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7zdwq" event={"ID":"d4f20252-c046-47df-b4c3-a9438fa7068b","Type":"ContainerStarted","Data":"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"}
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.829827 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7zdwq" podUID="d4f20252-c046-47df-b4c3-a9438fa7068b" containerName="registry-server" containerID="cri-o://f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104" gracePeriod=2
Dec 05 20:26:52 crc kubenswrapper[4744]: I1205 20:26:52.849072 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7zdwq" podStartSLOduration=2.592287006 podStartE2EDuration="4.849043508s" podCreationTimestamp="2025-12-05 20:26:48 +0000 UTC" firstStartedPulling="2025-12-05 20:26:49.524605105 +0000 UTC m=+979.754416473" lastFinishedPulling="2025-12-05 20:26:51.781361607 +0000 UTC m=+982.011172975" observedRunningTime="2025-12-05 20:26:52.847019973 +0000 UTC m=+983.076831361" watchObservedRunningTime="2025-12-05 20:26:52.849043508 +0000 UTC m=+983.078854917"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.000404 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nxws4"]
Dec 05 20:26:53 crc kubenswrapper[4744]: W1205 20:26:53.002131 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8a4464_7411_4fb6_9078_c922921d4a65.slice/crio-ac243b2f66f3b3dda5b21407272f0c9990a6e859e3507577ae200991120a6ee0 WatchSource:0}: Error finding container ac243b2f66f3b3dda5b21407272f0c9990a6e859e3507577ae200991120a6ee0: Status 404 returned error can't find the container with id ac243b2f66f3b3dda5b21407272f0c9990a6e859e3507577ae200991120a6ee0
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.293518 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.334704 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9skd\" (UniqueName: \"kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd\") pod \"d4f20252-c046-47df-b4c3-a9438fa7068b\" (UID: \"d4f20252-c046-47df-b4c3-a9438fa7068b\") "
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.340093 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd" (OuterVolumeSpecName: "kube-api-access-l9skd") pod "d4f20252-c046-47df-b4c3-a9438fa7068b" (UID: "d4f20252-c046-47df-b4c3-a9438fa7068b"). InnerVolumeSpecName "kube-api-access-l9skd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.435972 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9skd\" (UniqueName: \"kubernetes.io/projected/d4f20252-c046-47df-b4c3-a9438fa7068b-kube-api-access-l9skd\") on node \"crc\" DevicePath \"\""
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.840668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nxws4" event={"ID":"9b8a4464-7411-4fb6-9078-c922921d4a65","Type":"ContainerStarted","Data":"7b89d5ebd43002a48e05417d5132a03814fb64691a893793d581184a4d632233"}
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.840717 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nxws4" event={"ID":"9b8a4464-7411-4fb6-9078-c922921d4a65","Type":"ContainerStarted","Data":"ac243b2f66f3b3dda5b21407272f0c9990a6e859e3507577ae200991120a6ee0"}
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.842753 4744 generic.go:334] "Generic (PLEG): container finished" podID="d4f20252-c046-47df-b4c3-a9438fa7068b" containerID="f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104" exitCode=0
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.842791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7zdwq" event={"ID":"d4f20252-c046-47df-b4c3-a9438fa7068b","Type":"ContainerDied","Data":"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"}
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.842812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7zdwq" event={"ID":"d4f20252-c046-47df-b4c3-a9438fa7068b","Type":"ContainerDied","Data":"f465e2436963af246bd248de620a20e4bf0322c99693a8391f8602ae86807aa1"}
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.842819 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7zdwq"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.842833 4744 scope.go:117] "RemoveContainer" containerID="f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.864979 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nxws4" podStartSLOduration=1.820336067 podStartE2EDuration="1.864948268s" podCreationTimestamp="2025-12-05 20:26:52 +0000 UTC" firstStartedPulling="2025-12-05 20:26:53.005728403 +0000 UTC m=+983.235539811" lastFinishedPulling="2025-12-05 20:26:53.050340644 +0000 UTC m=+983.280152012" observedRunningTime="2025-12-05 20:26:53.861167534 +0000 UTC m=+984.090978912" watchObservedRunningTime="2025-12-05 20:26:53.864948268 +0000 UTC m=+984.094759636"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.880736 4744 scope.go:117] "RemoveContainer" containerID="f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"
Dec 05 20:26:53 crc kubenswrapper[4744]: E1205 20:26:53.881335 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104\": container with ID starting with f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104 not found: ID does not exist" containerID="f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.881404 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104"} err="failed to get container status \"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104\": rpc error: code = NotFound desc = could not find container \"f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104\": container with ID starting with f489a849332fa5d79b15a539350ba9a2f03f4a094d23c04e0cca656ba4c75104 not found: ID does not exist"
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.887143 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:53 crc kubenswrapper[4744]: I1205 20:26:53.896551 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7zdwq"]
Dec 05 20:26:54 crc kubenswrapper[4744]: I1205 20:26:54.090237 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f20252-c046-47df-b4c3-a9438fa7068b" path="/var/lib/kubelet/pods/d4f20252-c046-47df-b4c3-a9438fa7068b/volumes"
Dec 05 20:27:02 crc kubenswrapper[4744]: I1205 20:27:02.450210 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:27:02 crc kubenswrapper[4744]: I1205 20:27:02.450973 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:27:02 crc kubenswrapper[4744]: I1205 20:27:02.476523 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:27:02 crc kubenswrapper[4744]: I1205 20:27:02.963572 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nxws4"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.840947 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"]
Dec 05 20:27:09 crc kubenswrapper[4744]: E1205 20:27:09.841916 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f20252-c046-47df-b4c3-a9438fa7068b" containerName="registry-server"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.841935 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f20252-c046-47df-b4c3-a9438fa7068b" containerName="registry-server"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.842109 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f20252-c046-47df-b4c3-a9438fa7068b" containerName="registry-server"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.843454 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.846032 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w9k4r"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.853132 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"]
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.881045 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.881174 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dts4\" (UniqueName: \"kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.881221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.982670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dts4\" (UniqueName: \"kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.982786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.982872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.983276 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:09 crc kubenswrapper[4744]: I1205 20:27:09.983640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:10 crc kubenswrapper[4744]: I1205 20:27:10.016108 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dts4\" (UniqueName: \"kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:10 crc kubenswrapper[4744]: I1205 20:27:10.174519 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:10 crc kubenswrapper[4744]: I1205 20:27:10.412670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"]
Dec 05 20:27:10 crc kubenswrapper[4744]: I1205 20:27:10.980497 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6" event={"ID":"b06ca702-ab58-4297-ab66-f8fbc71358e5","Type":"ContainerStarted","Data":"913f13c998621ab81ae76c8716c1fa35fb67de1078d7b2a91a19b1d0dcaa6b17"}
Dec 05 20:27:11 crc kubenswrapper[4744]: I1205 20:27:11.990022 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerID="aeb5a01cb8d950ad1b2a810bb68f8ad6f420de4a629c60b649e1d0ca70e08063" exitCode=0
Dec 05 20:27:11 crc kubenswrapper[4744]: I1205 20:27:11.990071 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6" event={"ID":"b06ca702-ab58-4297-ab66-f8fbc71358e5","Type":"ContainerDied","Data":"aeb5a01cb8d950ad1b2a810bb68f8ad6f420de4a629c60b649e1d0ca70e08063"}
Dec 05 20:27:13 crc kubenswrapper[4744]: I1205 20:27:13.001402 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerID="7077f973e017a84b0bfe91ad341c64acbf738e65f3fd158b6e8e5584b8f7d8ab" exitCode=0
Dec 05 20:27:13 crc kubenswrapper[4744]: I1205 20:27:13.001985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6" event={"ID":"b06ca702-ab58-4297-ab66-f8fbc71358e5","Type":"ContainerDied","Data":"7077f973e017a84b0bfe91ad341c64acbf738e65f3fd158b6e8e5584b8f7d8ab"}
Dec 05 20:27:14 crc kubenswrapper[4744]: I1205 20:27:14.014243 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6" event={"ID":"b06ca702-ab58-4297-ab66-f8fbc71358e5","Type":"ContainerDied","Data":"d5caf7d76dbe45d3a3a610152a5b8be6da8cbf9fba50a4473a7244073452c17e"}
Dec 05 20:27:14 crc kubenswrapper[4744]: I1205 20:27:14.014336 4744 generic.go:334] "Generic (PLEG): container finished" podID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerID="d5caf7d76dbe45d3a3a610152a5b8be6da8cbf9fba50a4473a7244073452c17e" exitCode=0
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.357146 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.458988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util\") pod \"b06ca702-ab58-4297-ab66-f8fbc71358e5\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") "
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.459118 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dts4\" (UniqueName: \"kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4\") pod \"b06ca702-ab58-4297-ab66-f8fbc71358e5\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") "
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.459209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle\") pod \"b06ca702-ab58-4297-ab66-f8fbc71358e5\" (UID: \"b06ca702-ab58-4297-ab66-f8fbc71358e5\") "
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.460084 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle" (OuterVolumeSpecName: "bundle") pod "b06ca702-ab58-4297-ab66-f8fbc71358e5" (UID: "b06ca702-ab58-4297-ab66-f8fbc71358e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.465532 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4" (OuterVolumeSpecName: "kube-api-access-5dts4") pod "b06ca702-ab58-4297-ab66-f8fbc71358e5" (UID: "b06ca702-ab58-4297-ab66-f8fbc71358e5"). InnerVolumeSpecName "kube-api-access-5dts4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.479688 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util" (OuterVolumeSpecName: "util") pod "b06ca702-ab58-4297-ab66-f8fbc71358e5" (UID: "b06ca702-ab58-4297-ab66-f8fbc71358e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.561104 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.561151 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dts4\" (UniqueName: \"kubernetes.io/projected/b06ca702-ab58-4297-ab66-f8fbc71358e5-kube-api-access-5dts4\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:15 crc kubenswrapper[4744]: I1205 20:27:15.561166 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b06ca702-ab58-4297-ab66-f8fbc71358e5-util\") on node \"crc\" DevicePath \"\""
Dec 05 20:27:16 crc kubenswrapper[4744]: I1205 20:27:16.032558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6" event={"ID":"b06ca702-ab58-4297-ab66-f8fbc71358e5","Type":"ContainerDied","Data":"913f13c998621ab81ae76c8716c1fa35fb67de1078d7b2a91a19b1d0dcaa6b17"}
Dec 05 20:27:16 crc kubenswrapper[4744]: I1205 20:27:16.032916 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="913f13c998621ab81ae76c8716c1fa35fb67de1078d7b2a91a19b1d0dcaa6b17"
Dec 05 20:27:16 crc kubenswrapper[4744]: I1205 20:27:16.032658 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6"
Dec 05 20:27:19 crc kubenswrapper[4744]: I1205 20:27:19.807074 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:27:19 crc kubenswrapper[4744]: I1205 20:27:19.807470 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.230011 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"]
Dec 05 20:27:23 crc kubenswrapper[4744]: E1205 20:27:23.230544 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="extract"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.230557 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="extract"
Dec 05 20:27:23 crc kubenswrapper[4744]: E1205 20:27:23.230575 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="pull"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.230581 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="pull"
Dec 05 20:27:23 crc kubenswrapper[4744]: E1205 20:27:23.230596 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="util"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.230605 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="util"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.230719 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06ca702-ab58-4297-ab66-f8fbc71358e5" containerName="extract"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.231133 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.233859 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-nz68s"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.270826 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"]
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.281034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xvl\" (UniqueName: \"kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl\") pod \"openstack-operator-controller-operator-56699b584c-kpnbl\" (UID: \"188e3fd8-70e3-485f-8c79-3f47c9a88474\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.382012 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xvl\" (UniqueName: \"kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl\") pod \"openstack-operator-controller-operator-56699b584c-kpnbl\" (UID: \"188e3fd8-70e3-485f-8c79-3f47c9a88474\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.409332 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xvl\" (UniqueName: \"kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl\") pod \"openstack-operator-controller-operator-56699b584c-kpnbl\" (UID: \"188e3fd8-70e3-485f-8c79-3f47c9a88474\") " pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.548053 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:23 crc kubenswrapper[4744]: I1205 20:27:23.770745 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"]
Dec 05 20:27:24 crc kubenswrapper[4744]: I1205 20:27:24.094530 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" event={"ID":"188e3fd8-70e3-485f-8c79-3f47c9a88474","Type":"ContainerStarted","Data":"a9db3de2b622b2fad6590028a92349372d7fef93cc00b2f8acdb1c9cdf89f903"}
Dec 05 20:27:29 crc kubenswrapper[4744]: I1205 20:27:29.142588 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" event={"ID":"188e3fd8-70e3-485f-8c79-3f47c9a88474","Type":"ContainerStarted","Data":"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57"}
Dec 05 20:27:29 crc kubenswrapper[4744]: I1205 20:27:29.143738 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:29 crc kubenswrapper[4744]: I1205 20:27:29.189098 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" podStartSLOduration=1.8819879309999998 podStartE2EDuration="6.189076913s" podCreationTimestamp="2025-12-05 20:27:23 +0000 UTC" firstStartedPulling="2025-12-05 20:27:23.77879696 +0000 UTC m=+1014.008608338" lastFinishedPulling="2025-12-05 20:27:28.085885952 +0000 UTC m=+1018.315697320" observedRunningTime="2025-12-05 20:27:29.185473503 +0000 UTC m=+1019.415284911" watchObservedRunningTime="2025-12-05 20:27:29.189076913 +0000 UTC m=+1019.418888281"
Dec 05 20:27:33 crc kubenswrapper[4744]: I1205 20:27:33.552817 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"
Dec 05 20:27:49 crc kubenswrapper[4744]: I1205 20:27:49.806617 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:27:49 crc kubenswrapper[4744]: I1205 20:27:49.808428 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.155659 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.156955 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.158976 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9tsms"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.162581 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.190709 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.191902 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.194668 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tr2wz"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.205005 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.205947 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.209714 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-gktr8"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.217856 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.218978 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.223784 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-g8m8s"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.230770 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.253365 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.274395 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.290216 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.291498 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.293875 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lfrp9"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.295042 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xfvt\" (UniqueName: \"kubernetes.io/projected/3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f-kube-api-access-6xfvt\") pod \"barbican-operator-controller-manager-7d9dfd778-rw45s\" (UID: \"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.295097 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mjd\" (UniqueName: \"kubernetes.io/projected/b82be17d-c46f-4d8d-9264-d51d1b2ef12e-kube-api-access-95mjd\") pod \"cinder-operator-controller-manager-859b6ccc6-kj76m\" (UID: \"b82be17d-c46f-4d8d-9264-d51d1b2ef12e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.295137 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w98xs\" (UniqueName: \"kubernetes.io/projected/15e7fd30-4c0d-45f6-8905-ab235fc32e16-kube-api-access-w98xs\") pod \"designate-operator-controller-manager-78b4bc895b-b4sx5\" (UID: \"15e7fd30-4c0d-45f6-8905-ab235fc32e16\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.295218 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9rsn\" (UniqueName: \"kubernetes.io/projected/e038d15c-67e9-4551-b13b-c541b4b76827-kube-api-access-q9rsn\") pod \"glance-operator-controller-manager-77987cd8cd-8pp8k\" (UID: \"e038d15c-67e9-4551-b13b-c541b4b76827\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.299710 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.300748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.307465 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.307674 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-d4jpr"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.313281 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.314328 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.316100 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fl86m"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.319456 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.323853 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.331128 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.332475 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.336559 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mclfv"
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.337065 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.343369 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.372551 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb"]
Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.374073 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb"
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.377906 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-nlzcr" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.378586 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.395944 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9rsn\" (UniqueName: \"kubernetes.io/projected/e038d15c-67e9-4551-b13b-c541b4b76827-kube-api-access-q9rsn\") pod \"glance-operator-controller-manager-77987cd8cd-8pp8k\" (UID: \"e038d15c-67e9-4551-b13b-c541b4b76827\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.395995 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vb2g\" (UniqueName: \"kubernetes.io/projected/8656f7db-cb1e-40fa-ba97-93a647f869ac-kube-api-access-2vb2g\") pod \"ironic-operator-controller-manager-6c548fd776-4dm28\" (UID: \"8656f7db-cb1e-40fa-ba97-93a647f869ac\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396047 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdjn\" (UniqueName: \"kubernetes.io/projected/a4ad9153-5de0-4bb5-a419-fe70e3099450-kube-api-access-pbdjn\") pod \"heat-operator-controller-manager-5f64f6f8bb-n4ljw\" (UID: \"a4ad9153-5de0-4bb5-a419-fe70e3099450\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396078 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hqj\" (UniqueName: \"kubernetes.io/projected/341519c2-107a-440a-bfbb-af937e0c681f-kube-api-access-42hqj\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396100 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cms94\" (UniqueName: \"kubernetes.io/projected/6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74-kube-api-access-cms94\") pod \"horizon-operator-controller-manager-68c6d99b8f-mmzcc\" (UID: \"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396122 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xfvt\" (UniqueName: 
\"kubernetes.io/projected/3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f-kube-api-access-6xfvt\") pod \"barbican-operator-controller-manager-7d9dfd778-rw45s\" (UID: \"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mjd\" (UniqueName: \"kubernetes.io/projected/b82be17d-c46f-4d8d-9264-d51d1b2ef12e-kube-api-access-95mjd\") pod \"cinder-operator-controller-manager-859b6ccc6-kj76m\" (UID: \"b82be17d-c46f-4d8d-9264-d51d1b2ef12e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.396177 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w98xs\" (UniqueName: \"kubernetes.io/projected/15e7fd30-4c0d-45f6-8905-ab235fc32e16-kube-api-access-w98xs\") pod \"designate-operator-controller-manager-78b4bc895b-b4sx5\" (UID: \"15e7fd30-4c0d-45f6-8905-ab235fc32e16\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.411801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-879df"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.413494 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.423041 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bmwmn" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.433585 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xfvt\" (UniqueName: \"kubernetes.io/projected/3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f-kube-api-access-6xfvt\") pod \"barbican-operator-controller-manager-7d9dfd778-rw45s\" (UID: \"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.448956 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mjd\" (UniqueName: \"kubernetes.io/projected/b82be17d-c46f-4d8d-9264-d51d1b2ef12e-kube-api-access-95mjd\") pod \"cinder-operator-controller-manager-859b6ccc6-kj76m\" (UID: \"b82be17d-c46f-4d8d-9264-d51d1b2ef12e\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.467080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9rsn\" (UniqueName: \"kubernetes.io/projected/e038d15c-67e9-4551-b13b-c541b4b76827-kube-api-access-q9rsn\") pod \"glance-operator-controller-manager-77987cd8cd-8pp8k\" (UID: \"e038d15c-67e9-4551-b13b-c541b4b76827\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.470631 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-879df"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.488082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w98xs\" (UniqueName: 
\"kubernetes.io/projected/15e7fd30-4c0d-45f6-8905-ab235fc32e16-kube-api-access-w98xs\") pod \"designate-operator-controller-manager-78b4bc895b-b4sx5\" (UID: \"15e7fd30-4c0d-45f6-8905-ab235fc32e16\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.488440 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.496957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hqj\" (UniqueName: \"kubernetes.io/projected/341519c2-107a-440a-bfbb-af937e0c681f-kube-api-access-42hqj\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.496998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cms94\" (UniqueName: \"kubernetes.io/projected/6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74-kube-api-access-cms94\") pod \"horizon-operator-controller-manager-68c6d99b8f-mmzcc\" (UID: \"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.497024 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2dg\" (UniqueName: \"kubernetes.io/projected/022c2e13-58dd-42d3-a3a4-91a3eb74e0b5-kube-api-access-5h2dg\") pod \"manila-operator-controller-manager-7c79b5df47-879df\" (UID: \"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.497050 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spx26\" (UniqueName: \"kubernetes.io/projected/021ea569-b351-4d31-8080-75f5ec005daa-kube-api-access-spx26\") pod \"keystone-operator-controller-manager-7765d96ddf-57gjb\" (UID: \"021ea569-b351-4d31-8080-75f5ec005daa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.497108 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vb2g\" (UniqueName: \"kubernetes.io/projected/8656f7db-cb1e-40fa-ba97-93a647f869ac-kube-api-access-2vb2g\") pod \"ironic-operator-controller-manager-6c548fd776-4dm28\" (UID: \"8656f7db-cb1e-40fa-ba97-93a647f869ac\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.497132 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.497149 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdjn\" (UniqueName: \"kubernetes.io/projected/a4ad9153-5de0-4bb5-a419-fe70e3099450-kube-api-access-pbdjn\") pod 
\"heat-operator-controller-manager-5f64f6f8bb-n4ljw\" (UID: \"a4ad9153-5de0-4bb5-a419-fe70e3099450\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:27:52 crc kubenswrapper[4744]: E1205 20:27:52.499103 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:52 crc kubenswrapper[4744]: E1205 20:27:52.499148 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. No retries permitted until 2025-12-05 20:27:52.999133467 +0000 UTC m=+1043.228944835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.507864 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.509135 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.520475 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.529676 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.529846 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.530854 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.533154 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.540666 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6fxld" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.540754 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.549063 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9pr6k" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.572439 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.621757 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2dg\" (UniqueName: \"kubernetes.io/projected/022c2e13-58dd-42d3-a3a4-91a3eb74e0b5-kube-api-access-5h2dg\") pod \"manila-operator-controller-manager-7c79b5df47-879df\" (UID: \"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.621807 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxkj\" (UniqueName: \"kubernetes.io/projected/69215470-ec91-4d88-99f1-99117a543086-kube-api-access-pnxkj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m8844\" (UID: \"69215470-ec91-4d88-99f1-99117a543086\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.621837 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spx26\" (UniqueName: \"kubernetes.io/projected/021ea569-b351-4d31-8080-75f5ec005daa-kube-api-access-spx26\") pod \"keystone-operator-controller-manager-7765d96ddf-57gjb\" (UID: \"021ea569-b351-4d31-8080-75f5ec005daa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.621883 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7z2\" (UniqueName: \"kubernetes.io/projected/0d2bbe6b-5adb-402a-8ef6-d7be819d5b73-kube-api-access-xq7z2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dffqv\" (UID: \"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.623978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdjn\" (UniqueName: \"kubernetes.io/projected/a4ad9153-5de0-4bb5-a419-fe70e3099450-kube-api-access-pbdjn\") pod \"heat-operator-controller-manager-5f64f6f8bb-n4ljw\" (UID: \"a4ad9153-5de0-4bb5-a419-fe70e3099450\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.626420 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.628635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.638525 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cms94\" (UniqueName: \"kubernetes.io/projected/6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74-kube-api-access-cms94\") pod \"horizon-operator-controller-manager-68c6d99b8f-mmzcc\" (UID: \"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.642246 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.662657 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.662993 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.671896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vb2g\" (UniqueName: \"kubernetes.io/projected/8656f7db-cb1e-40fa-ba97-93a647f869ac-kube-api-access-2vb2g\") pod \"ironic-operator-controller-manager-6c548fd776-4dm28\" (UID: \"8656f7db-cb1e-40fa-ba97-93a647f869ac\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.673060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hqj\" (UniqueName: \"kubernetes.io/projected/341519c2-107a-440a-bfbb-af937e0c681f-kube-api-access-42hqj\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.678416 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cjhrn" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.736756 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2dg\" (UniqueName: \"kubernetes.io/projected/022c2e13-58dd-42d3-a3a4-91a3eb74e0b5-kube-api-access-5h2dg\") pod \"manila-operator-controller-manager-7c79b5df47-879df\" (UID: \"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.762096 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spx26\" (UniqueName: \"kubernetes.io/projected/021ea569-b351-4d31-8080-75f5ec005daa-kube-api-access-spx26\") pod \"keystone-operator-controller-manager-7765d96ddf-57gjb\" (UID: \"021ea569-b351-4d31-8080-75f5ec005daa\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.765033 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.766178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxkj\" (UniqueName: \"kubernetes.io/projected/69215470-ec91-4d88-99f1-99117a543086-kube-api-access-pnxkj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m8844\" (UID: \"69215470-ec91-4d88-99f1-99117a543086\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.766233 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7z2\" (UniqueName: \"kubernetes.io/projected/0d2bbe6b-5adb-402a-8ef6-d7be819d5b73-kube-api-access-xq7z2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dffqv\" (UID: \"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.781772 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6qjvb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.819672 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.820856 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7z2\" (UniqueName: \"kubernetes.io/projected/0d2bbe6b-5adb-402a-8ef6-d7be819d5b73-kube-api-access-xq7z2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dffqv\" (UID: \"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.831609 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.849225 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.867454 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2hp\" (UniqueName: \"kubernetes.io/projected/f3194afc-f21e-4fb0-bc31-5ac4b1b6e434-kube-api-access-rb2hp\") pod \"nova-operator-controller-manager-697bc559fc-6pdfn\" (UID: \"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.867532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnc9\" (UniqueName: \"kubernetes.io/projected/2c3d0695-b544-47a5-ad85-36f8fd2f1dcb-kube-api-access-7gnc9\") pod \"octavia-operator-controller-manager-998648c74-kdhmb\" (UID: \"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.868104 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxkj\" (UniqueName: \"kubernetes.io/projected/69215470-ec91-4d88-99f1-99117a543086-kube-api-access-pnxkj\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m8844\" (UID: \"69215470-ec91-4d88-99f1-99117a543086\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.963624 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.970470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnc9\" (UniqueName: \"kubernetes.io/projected/2c3d0695-b544-47a5-ad85-36f8fd2f1dcb-kube-api-access-7gnc9\") pod \"octavia-operator-controller-manager-998648c74-kdhmb\" (UID: \"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.970517 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8"] Dec 05 20:27:52 crc kubenswrapper[4744]: I1205 20:27:52.970595 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2hp\" (UniqueName: \"kubernetes.io/projected/f3194afc-f21e-4fb0-bc31-5ac4b1b6e434-kube-api-access-rb2hp\") pod \"nova-operator-controller-manager-697bc559fc-6pdfn\" (UID: \"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:52.972863 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:52.973715 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:52.985819 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.003390 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnc9\" (UniqueName: \"kubernetes.io/projected/2c3d0695-b544-47a5-ad85-36f8fd2f1dcb-kube-api-access-7gnc9\") pod \"octavia-operator-controller-manager-998648c74-kdhmb\" (UID: \"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.003532 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vmddm" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.004502 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.022453 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.027734 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.027859 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.036955 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.058909 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.059755 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.059857 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nws7g" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.060370 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.076124 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.076250 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.076467 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. 
No retries permitted until 2025-12-05 20:27:54.076452804 +0000 UTC m=+1044.306264172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.083997 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kvl9t" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.085916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2hp\" (UniqueName: \"kubernetes.io/projected/f3194afc-f21e-4fb0-bc31-5ac4b1b6e434-kube-api-access-rb2hp\") pod \"nova-operator-controller-manager-697bc559fc-6pdfn\" (UID: \"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.085978 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.100373 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.101900 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.105168 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.106249 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.111603 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.145846 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.146577 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ltsf2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.146773 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-nxms4" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.154061 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.165705 4744 util.go:30] "No sandbox for pod can be found. 
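The secret.go:188 errors show the failure is a plain API lookup: kubelet's secret volume plugin fetches the Secret object and it simply does not exist yet. A hedged client-go sketch of the same check follows, useful for confirming from outside the node whether the webhook cert secret has appeared; the kubeconfig path and hard-coded names are illustrative assumptions, and this is not kubelet's own code path.

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the default kubeconfig (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// The same GET the secret volume plugin performs; a NotFound here is
	// exactly the 'secret "..." not found' error in the log above.
	_, err = client.CoreV1().Secrets("openstack-operators").Get(
		context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; cert issuance still pending")
	case err != nil:
		panic(err)
	default:
		fmt.Println("secret exists; the pending mount retry should succeed")
	}
}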
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.206188 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.208224 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.220615 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mh7xw" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.222073 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxrr\" (UniqueName: \"kubernetes.io/projected/588bb9d2-d747-43cb-8e9f-73d1961bebf1-kube-api-access-8zxrr\") pod \"ovn-operator-controller-manager-b6456fdb6-qqjj8\" (UID: \"588bb9d2-d747-43cb-8e9f-73d1961bebf1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.222109 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.222129 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd274\" (UniqueName: \"kubernetes.io/projected/807989ca-0470-47bc-8bef-9c1dd35e4bb0-kube-api-access-xd274\") pod \"placement-operator-controller-manager-78f8948974-4b9p2\" (UID: \"807989ca-0470-47bc-8bef-9c1dd35e4bb0\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.222174 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2k2\" (UniqueName: \"kubernetes.io/projected/43979be1-9cc5-445f-b079-b4504355cce4-kube-api-access-5r2k2\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.256914 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.268431 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.270158 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.279517 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.356166 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5ljb9" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd274\" (UniqueName: \"kubernetes.io/projected/807989ca-0470-47bc-8bef-9c1dd35e4bb0-kube-api-access-xd274\") pod \"placement-operator-controller-manager-78f8948974-4b9p2\" (UID: \"807989ca-0470-47bc-8bef-9c1dd35e4bb0\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357411 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4j7r\" (UniqueName: \"kubernetes.io/projected/4aae801a-e589-469a-b153-116744edc63b-kube-api-access-w4j7r\") pod \"swift-operator-controller-manager-5f8c65bbfc-9t8ch\" (UID: \"4aae801a-e589-469a-b153-116744edc63b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2k2\" (UniqueName: \"kubernetes.io/projected/43979be1-9cc5-445f-b079-b4504355cce4-kube-api-access-5r2k2\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357464 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbs2b\" (UniqueName: \"kubernetes.io/projected/5e4a6d16-0c89-4bd1-aa53-ce798baff113-kube-api-access-wbs2b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-v879l\" (UID: \"5e4a6d16-0c89-4bd1-aa53-ce798baff113\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357501 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2b5x\" (UniqueName: \"kubernetes.io/projected/564e07f4-0673-42d8-a7ee-68366706b2d4-kube-api-access-k2b5x\") pod \"test-operator-controller-manager-5854674fcc-lbxdx\" (UID: \"564e07f4-0673-42d8-a7ee-68366706b2d4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357519 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6phz\" (UniqueName: 
\"kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz\") pod \"watcher-operator-controller-manager-7f6cb9b975-hx6rp\" (UID: \"9f0e37bb-46f6-45de-a562-ee1ce4d89c74\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.357572 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zxrr\" (UniqueName: \"kubernetes.io/projected/588bb9d2-d747-43cb-8e9f-73d1961bebf1-kube-api-access-8zxrr\") pod \"ovn-operator-controller-manager-b6456fdb6-qqjj8\" (UID: \"588bb9d2-d747-43cb-8e9f-73d1961bebf1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.358327 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.358365 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:53.858352912 +0000 UTC m=+1044.088164280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.458329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4j7r\" (UniqueName: \"kubernetes.io/projected/4aae801a-e589-469a-b153-116744edc63b-kube-api-access-w4j7r\") pod \"swift-operator-controller-manager-5f8c65bbfc-9t8ch\" (UID: \"4aae801a-e589-469a-b153-116744edc63b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.458384 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbs2b\" (UniqueName: \"kubernetes.io/projected/5e4a6d16-0c89-4bd1-aa53-ce798baff113-kube-api-access-wbs2b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-v879l\" (UID: \"5e4a6d16-0c89-4bd1-aa53-ce798baff113\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.458420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2b5x\" (UniqueName: \"kubernetes.io/projected/564e07f4-0673-42d8-a7ee-68366706b2d4-kube-api-access-k2b5x\") pod \"test-operator-controller-manager-5854674fcc-lbxdx\" (UID: \"564e07f4-0673-42d8-a7ee-68366706b2d4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.458443 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6phz\" (UniqueName: \"kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz\") pod \"watcher-operator-controller-manager-7f6cb9b975-hx6rp\" (UID: \"9f0e37bb-46f6-45de-a562-ee1ce4d89c74\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.481838 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxrr\" (UniqueName: \"kubernetes.io/projected/588bb9d2-d747-43cb-8e9f-73d1961bebf1-kube-api-access-8zxrr\") pod \"ovn-operator-controller-manager-b6456fdb6-qqjj8\" (UID: \"588bb9d2-d747-43cb-8e9f-73d1961bebf1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.536992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd274\" (UniqueName: \"kubernetes.io/projected/807989ca-0470-47bc-8bef-9c1dd35e4bb0-kube-api-access-xd274\") pod \"placement-operator-controller-manager-78f8948974-4b9p2\" (UID: \"807989ca-0470-47bc-8bef-9c1dd35e4bb0\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.538030 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbs2b\" (UniqueName: \"kubernetes.io/projected/5e4a6d16-0c89-4bd1-aa53-ce798baff113-kube-api-access-wbs2b\") pod \"telemetry-operator-controller-manager-76cc84c6bb-v879l\" (UID: \"5e4a6d16-0c89-4bd1-aa53-ce798baff113\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.543531 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2b5x\" (UniqueName: \"kubernetes.io/projected/564e07f4-0673-42d8-a7ee-68366706b2d4-kube-api-access-k2b5x\") pod \"test-operator-controller-manager-5854674fcc-lbxdx\" (UID: \"564e07f4-0673-42d8-a7ee-68366706b2d4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.549109 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.551130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4j7r\" (UniqueName: \"kubernetes.io/projected/4aae801a-e589-469a-b153-116744edc63b-kube-api-access-w4j7r\") pod \"swift-operator-controller-manager-5f8c65bbfc-9t8ch\" (UID: \"4aae801a-e589-469a-b153-116744edc63b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.551456 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.556044 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.556212 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hs5hm" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.556638 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6phz\" (UniqueName: \"kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz\") pod \"watcher-operator-controller-manager-7f6cb9b975-hx6rp\" (UID: \"9f0e37bb-46f6-45de-a562-ee1ce4d89c74\") " pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.559563 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzq8j\" (UniqueName: \"kubernetes.io/projected/63391739-cc08-49ea-be59-2c0740078450-kube-api-access-wzq8j\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.559636 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.559685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.559813 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.565456 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k"] Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.583634 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.585446 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2k2\" (UniqueName: \"kubernetes.io/projected/43979be1-9cc5-445f-b079-b4504355cce4-kube-api-access-5r2k2\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.596220 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.611528 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.630162 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.660763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.660991 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzq8j\" (UniqueName: \"kubernetes.io/projected/63391739-cc08-49ea-be59-2c0740078450-kube-api-access-wzq8j\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.661091 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.661266 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.661377 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:54.16136038 +0000 UTC m=+1044.391171748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.662168 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: E1205 20:27:53.662268 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:54.162258489 +0000 UTC m=+1044.392069857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.680580 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.785644 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzq8j\" (UniqueName: \"kubernetes.io/projected/63391739-cc08-49ea-be59-2c0740078450-kube-api-access-wzq8j\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.787468 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:27:53 crc kubenswrapper[4744]: I1205 20:27:53.815655 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc"] Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.310503 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.316647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xwtcm" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.324723 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.324786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.324823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.324869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325039 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325090 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:55.325072038 +0000 UTC m=+1045.554883406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325853 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325885 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:55.325874786 +0000 UTC m=+1045.555686154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325930 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325951 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. No retries permitted until 2025-12-05 20:27:56.325943397 +0000 UTC m=+1046.555754765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.325987 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: E1205 20:27:54.326008 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:55.326001208 +0000 UTC m=+1045.555812576 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.329809 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc"] Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.385884 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m"] Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.426648 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn47l\" (UniqueName: \"kubernetes.io/projected/41b48b2f-7b8d-46df-a226-6c163e4f57b0-kube-api-access-qn47l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lh9hc\" (UID: \"41b48b2f-7b8d-46df-a226-6c163e4f57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.527854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn47l\" (UniqueName: \"kubernetes.io/projected/41b48b2f-7b8d-46df-a226-6c163e4f57b0-kube-api-access-qn47l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lh9hc\" (UID: \"41b48b2f-7b8d-46df-a226-6c163e4f57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.554411 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.565334 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn47l\" (UniqueName: \"kubernetes.io/projected/41b48b2f-7b8d-46df-a226-6c163e4f57b0-kube-api-access-qn47l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lh9hc\" (UID: \"41b48b2f-7b8d-46df-a226-6c163e4f57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" Dec 05 20:27:54 crc kubenswrapper[4744]: I1205 20:27:54.789678 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.334953 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.359312 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.359371 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.359408 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359554 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359605 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:57.359582392 +0000 UTC m=+1047.589401660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359886 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359912 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:57.359904439 +0000 UTC m=+1047.589715807 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359964 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: E1205 20:27:55.359983 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:27:57.35997779 +0000 UTC m=+1047.589789158 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.397212 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.487132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" event={"ID":"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f","Type":"ContainerStarted","Data":"a7bc168112cfa567d4051f1eace15b94fba6d86029fb2a02334669df82822962"} Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.489375 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" event={"ID":"b82be17d-c46f-4d8d-9264-d51d1b2ef12e","Type":"ContainerStarted","Data":"2ca3513631a836e970d69929161229faa4eb0bc3ccf2d93b665d4a95acacc8e9"} Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.490036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" event={"ID":"e038d15c-67e9-4551-b13b-c541b4b76827","Type":"ContainerStarted","Data":"d927423c59f3486cd942607240ae39f1e8fd49f83072070da178f78a7c9b1770"} Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.928933 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.943767 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.954002 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-879df"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.965476 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28"] Dec 05 20:27:55 crc kubenswrapper[4744]: I1205 20:27:55.995964 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2"] Dec 05 20:27:55 crc kubenswrapper[4744]: W1205 20:27:55.998724 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022c2e13_58dd_42d3_a3a4_91a3eb74e0b5.slice/crio-46f5f45512ae63936b6ab3617ca1ae8a506cf27198c405dd6b85fcd396f1323a WatchSource:0}: Error finding container 46f5f45512ae63936b6ab3617ca1ae8a506cf27198c405dd6b85fcd396f1323a: Status 404 returned error can't find the container with id 46f5f45512ae63936b6ab3617ca1ae8a506cf27198c405dd6b85fcd396f1323a Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.024361 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.050515 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.222518 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.262545 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.273712 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.333169 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.333503 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.333588 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. No retries permitted until 2025-12-05 20:28:00.333545708 +0000 UTC m=+1050.563357086 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.414156 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.426173 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.468874 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.502019 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" event={"ID":"021ea569-b351-4d31-8080-75f5ec005daa","Type":"ContainerStarted","Data":"de7aeb83f8742f032306b1e79c091e64f3e112b28da20baf903e611fea038c10"} Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.518347 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8"] Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.520648 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rb2hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-6pdfn_openstack-operators(f3194afc-f21e-4fb0-bc31-5ac4b1b6e434): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.521245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" event={"ID":"41b48b2f-7b8d-46df-a226-6c163e4f57b0","Type":"ContainerStarted","Data":"c59d6b68e79f08823bae7dfca57163fa86eadd44e3fbb06199dc9fbd740e2566"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.526454 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rb2hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-6pdfn_openstack-operators(f3194afc-f21e-4fb0-bc31-5ac4b1b6e434): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.528668 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.528709 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" 
event={"ID":"8656f7db-cb1e-40fa-ba97-93a647f869ac","Type":"ContainerStarted","Data":"60ab49b30bb8014088308b1d48c7116179a5e97258088c1c68633e613ab12b45"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.528706 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" podUID="f3194afc-f21e-4fb0-bc31-5ac4b1b6e434" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.531831 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.536076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" event={"ID":"807989ca-0470-47bc-8bef-9c1dd35e4bb0","Type":"ContainerStarted","Data":"85fbc43633bb420a82a73e6e0c0afe2a99851b89e44cd4bdd45bb07b09ef9505"} Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.537032 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l"] Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.540888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" event={"ID":"15e7fd30-4c0d-45f6-8905-ab235fc32e16","Type":"ContainerStarted","Data":"fb28926af92de800e451afe3906b17b37e081e923ab73d5faced6fb3c3232521"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.541647 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zxrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-qqjj8_openstack-operators(588bb9d2-d747-43cb-8e9f-73d1961bebf1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.542729 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerStarted","Data":"f89d6968db54e8e8354ea65430841272db1bb0b5c30c786f02f3f8a999061a1f"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.543954 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zxrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-qqjj8_openstack-operators(588bb9d2-d747-43cb-8e9f-73d1961bebf1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.545672 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" podUID="588bb9d2-d747-43cb-8e9f-73d1961bebf1" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.545691 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" event={"ID":"4aae801a-e589-469a-b153-116744edc63b","Type":"ContainerStarted","Data":"3a91be37aa78103edc4fcccbe16727d8b2886fddd87c20858cc823e7d1cc6f44"} Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.547361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" event={"ID":"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74","Type":"ContainerStarted","Data":"4c37959fed9078fb9ff0af83d5d013b1ce53ec39d5061921e4bf73bf9ca1496f"} Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.560230 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" event={"ID":"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73","Type":"ContainerStarted","Data":"4e4424ee842cba7c909215e764f501546d90f0a3c74275bf174c86bb850f8c51"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.561804 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k2b5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lbxdx_openstack-operators(564e07f4-0673-42d8-a7ee-68366706b2d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: W1205 
20:27:56.567778 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ad9153_5de0_4bb5_a419_fe70e3099450.slice/crio-1416fbb867dd803f67ad92559d4b329d4d530f0c4b5f0ffa36d4ca06b95fb22d WatchSource:0}: Error finding container 1416fbb867dd803f67ad92559d4b329d4d530f0c4b5f0ffa36d4ca06b95fb22d: Status 404 returned error can't find the container with id 1416fbb867dd803f67ad92559d4b329d4d530f0c4b5f0ffa36d4ca06b95fb22d Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.567928 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k2b5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lbxdx_openstack-operators(564e07f4-0673-42d8-a7ee-68366706b2d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.567976 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" event={"ID":"69215470-ec91-4d88-99f1-99117a543086","Type":"ContainerStarted","Data":"58c36741b0bb88b44acc130e9227b326909ed37fab920fe5b2c0601702c027dd"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.569607 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" podUID="564e07f4-0673-42d8-a7ee-68366706b2d4" Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.571071 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbdjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-n4ljw_openstack-operators(a4ad9153-5de0-4bb5-a419-fe70e3099450): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.571382 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" event={"ID":"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5","Type":"ContainerStarted","Data":"46f5f45512ae63936b6ab3617ca1ae8a506cf27198c405dd6b85fcd396f1323a"} Dec 05 20:27:56 crc kubenswrapper[4744]: I1205 20:27:56.572309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" event={"ID":"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb","Type":"ContainerStarted","Data":"e610799bb197584ce4863337c357a7ac93ec3f475f092b32e1c930fa5bf22bd2"} Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.572859 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbdjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-n4ljw_openstack-operators(a4ad9153-5de0-4bb5-a419-fe70e3099450): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.573112 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbs2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-v879l_openstack-operators(5e4a6d16-0c89-4bd1-aa53-ce798baff113): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 20:27:56 crc kubenswrapper[4744]: E1205 20:27:56.574112 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" podUID="a4ad9153-5de0-4bb5-a419-fe70e3099450" Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.457055 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457440 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457517 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:01.457498131 +0000 UTC m=+1051.687309609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.457549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.457663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457861 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457888 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:01.457879389 +0000 UTC m=+1051.687690757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457933 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.457958 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:01.457950351 +0000 UTC m=+1051.687761829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.767249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" event={"ID":"588bb9d2-d747-43cb-8e9f-73d1961bebf1","Type":"ContainerStarted","Data":"04debbf0b239aeaa7bd3aabfb429143088dfc0daa822618370d03dd7fa134456"} Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.769869 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" event={"ID":"5e4a6d16-0c89-4bd1-aa53-ce798baff113","Type":"ContainerStarted","Data":"21f7e73004e082ff90798d66124a254e299d33ff5834918adfaaa2ffc8e90d52"} Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.770122 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" podUID="588bb9d2-d747-43cb-8e9f-73d1961bebf1" Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.780347 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" event={"ID":"a4ad9153-5de0-4bb5-a419-fe70e3099450","Type":"ContainerStarted","Data":"1416fbb867dd803f67ad92559d4b329d4d530f0c4b5f0ffa36d4ca06b95fb22d"} Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.783685 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" podUID="a4ad9153-5de0-4bb5-a419-fe70e3099450" Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.786185 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" event={"ID":"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434","Type":"ContainerStarted","Data":"6d9fefcbb0071cc83a529ca3f7709865cc28ba817bcbedf92c7f75717022263b"} Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.788906 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" podUID="f3194afc-f21e-4fb0-bc31-5ac4b1b6e434" Dec 05 20:27:57 crc kubenswrapper[4744]: I1205 20:27:57.789859 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" event={"ID":"564e07f4-0673-42d8-a7ee-68366706b2d4","Type":"ContainerStarted","Data":"840ac823d71f02cedde1194691d26491611309973aaf16821d94d774d4e9c9c1"} Dec 05 20:27:57 crc kubenswrapper[4744]: E1205 20:27:57.791853 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" podUID="564e07f4-0673-42d8-a7ee-68366706b2d4" Dec 05 20:27:58 crc kubenswrapper[4744]: E1205 20:27:58.814895 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" podUID="588bb9d2-d747-43cb-8e9f-73d1961bebf1" Dec 05 20:27:58 crc kubenswrapper[4744]: E1205 20:27:58.815856 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" podUID="f3194afc-f21e-4fb0-bc31-5ac4b1b6e434" Dec 05 20:27:58 crc kubenswrapper[4744]: E1205 20:27:58.816000 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" podUID="564e07f4-0673-42d8-a7ee-68366706b2d4" Dec 05 20:27:58 crc kubenswrapper[4744]: E1205 20:27:58.816052 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" podUID="a4ad9153-5de0-4bb5-a419-fe70e3099450" Dec 05 20:28:00 crc kubenswrapper[4744]: I1205 20:28:00.355470 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:28:00 crc kubenswrapper[4744]: E1205 20:28:00.355968 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:00 crc kubenswrapper[4744]: E1205 20:28:00.356020 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. No retries permitted until 2025-12-05 20:28:08.356003162 +0000 UTC m=+1058.585814530 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: I1205 20:28:01.478857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:01 crc kubenswrapper[4744]: I1205 20:28:01.478948 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:01 crc kubenswrapper[4744]: I1205 20:28:01.478998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479051 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479104 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:09.479089915 +0000 UTC m=+1059.708901283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479144 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479176 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:09.479166207 +0000 UTC m=+1059.708977575 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479221 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:01 crc kubenswrapper[4744]: E1205 20:28:01.479247 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:09.479237628 +0000 UTC m=+1059.709049006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:28:08 crc kubenswrapper[4744]: I1205 20:28:08.411787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:28:08 crc kubenswrapper[4744]: E1205 20:28:08.412415 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:08 crc kubenswrapper[4744]: E1205 20:28:08.412458 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert podName:341519c2-107a-440a-bfbb-af937e0c681f nodeName:}" failed. No retries permitted until 2025-12-05 20:28:24.412445085 +0000 UTC m=+1074.642256453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert") pod "infra-operator-controller-manager-57548d458d-5bfxc" (UID: "341519c2-107a-440a-bfbb-af937e0c681f") : secret "infra-operator-webhook-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: I1205 20:28:09.536200 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:09 crc kubenswrapper[4744]: I1205 20:28:09.536274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:09 crc kubenswrapper[4744]: I1205 20:28:09.536334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536444 4744 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536494 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536491 4744 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536526 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:25.53650377 +0000 UTC m=+1075.766315148 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "webhook-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536549 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert podName:43979be1-9cc5-445f-b079-b4504355cce4 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:25.536540401 +0000 UTC m=+1075.766351779 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" (UID: "43979be1-9cc5-445f-b079-b4504355cce4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 20:28:09 crc kubenswrapper[4744]: E1205 20:28:09.536576 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs podName:63391739-cc08-49ea-be59-2c0740078450 nodeName:}" failed. No retries permitted until 2025-12-05 20:28:25.536556791 +0000 UTC m=+1075.766368159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs") pod "openstack-operator-controller-manager-7dc867b75-npt5k" (UID: "63391739-cc08-49ea-be59-2c0740078450") : secret "metrics-server-cert" not found Dec 05 20:28:10 crc kubenswrapper[4744]: E1205 20:28:10.880032 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 20:28:10 crc kubenswrapper[4744]: E1205 20:28:10.880632 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xq7z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dffqv_openstack-operators(0d2bbe6b-5adb-402a-8ef6-d7be819d5b73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:12 crc kubenswrapper[4744]: E1205 20:28:12.689967 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 05 20:28:12 crc kubenswrapper[4744]: E1205 20:28:12.690141 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4j7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-9t8ch_openstack-operators(4aae801a-e589-469a-b153-116744edc63b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.120866 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.120951 4744 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.121074 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.9:5001/openstack-k8s-operators/watcher-operator:9e2b1e4b7b3896a4c4f152962f74457a6de43346,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6phz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f6cb9b975-hx6rp_openstack-operators(9f0e37bb-46f6-45de-a562-ee1ce4d89c74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.642497 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.642665 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qn47l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lh9hc_openstack-operators(41b48b2f-7b8d-46df-a226-6c163e4f57b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.644227 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" podUID="41b48b2f-7b8d-46df-a226-6c163e4f57b0" Dec 05 20:28:13 crc kubenswrapper[4744]: E1205 20:28:13.973392 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" podUID="41b48b2f-7b8d-46df-a226-6c163e4f57b0" Dec 05 20:28:15 crc kubenswrapper[4744]: E1205 20:28:15.333829 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 20:28:15 crc kubenswrapper[4744]: E1205 20:28:15.334064 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-spx26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-57gjb_openstack-operators(021ea569-b351-4d31-8080-75f5ec005daa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:19 crc kubenswrapper[4744]: I1205 20:28:19.807323 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:28:19 crc kubenswrapper[4744]: I1205 20:28:19.807685 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:28:19 crc kubenswrapper[4744]: I1205 20:28:19.807748 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:28:19 crc kubenswrapper[4744]: I1205 20:28:19.808267 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:28:19 crc kubenswrapper[4744]: I1205 20:28:19.808336 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc" gracePeriod=600 Dec 05 20:28:20 crc kubenswrapper[4744]: I1205 20:28:20.024648 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc" exitCode=0 Dec 05 20:28:20 crc kubenswrapper[4744]: I1205 20:28:20.024688 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc"} Dec 05 20:28:20 crc kubenswrapper[4744]: I1205 20:28:20.024719 4744 scope.go:117] "RemoveContainer" 
containerID="fcebdaf5fbdada46a4c4fdee6dfda24df67a9ddab7d4a2219b461c1be76e2942" Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.051716 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" event={"ID":"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5","Type":"ContainerStarted","Data":"95df15e5aa70d410ca32c606eb726c04298b2f659cac0660e8c11fd048c73c04"} Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.056418 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" event={"ID":"b82be17d-c46f-4d8d-9264-d51d1b2ef12e","Type":"ContainerStarted","Data":"4aa785bb2baa4be135917df988f818b62e2534b601d899ce33101070f5e453a2"} Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.061890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" event={"ID":"e038d15c-67e9-4551-b13b-c541b4b76827","Type":"ContainerStarted","Data":"8f4fb646f6fc198564fb016cd6b379cd4ebd8e5b3c4ad6f9007a63328d3c7cf7"} Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.065810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" event={"ID":"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb","Type":"ContainerStarted","Data":"650ce74de22c1f2a1f087ad2a9904d60c16888b2b379bbb1a80463b5a42c649b"} Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.067058 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" event={"ID":"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74","Type":"ContainerStarted","Data":"fd26a2ce0c4b6185397a6b3f264f0d2918230ab174938815151480e8bd2121b2"} Dec 05 20:28:23 crc kubenswrapper[4744]: I1205 20:28:23.068970 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" event={"ID":"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f","Type":"ContainerStarted","Data":"9a82fd97d252b66bcf8c6fa9454f35dd9ec93d7ba7d15fb62ffe846a1b5f4e95"} Dec 05 20:28:24 crc kubenswrapper[4744]: I1205 20:28:24.506514 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:28:24 crc kubenswrapper[4744]: I1205 20:28:24.516819 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341519c2-107a-440a-bfbb-af937e0c681f-cert\") pod \"infra-operator-controller-manager-57548d458d-5bfxc\" (UID: \"341519c2-107a-440a-bfbb-af937e0c681f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:28:24 crc kubenswrapper[4744]: I1205 20:28:24.734378 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.245723 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc"] Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.624819 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.624892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.624945 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.630829 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-webhook-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.636965 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43979be1-9cc5-445f-b079-b4504355cce4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m\" (UID: \"43979be1-9cc5-445f-b079-b4504355cce4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.646388 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63391739-cc08-49ea-be59-2c0740078450-metrics-certs\") pod \"openstack-operator-controller-manager-7dc867b75-npt5k\" (UID: \"63391739-cc08-49ea-be59-2c0740078450\") " pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.758949 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:25 crc kubenswrapper[4744]: I1205 20:28:25.808341 4744 util.go:30] "No sandbox for pod can be found. 
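The recurring pull errors in this window ("rpc error: code = Canceled desc = copying config: context canceled", plus one "pull QPS exceeded") are in-flight CRI PullImage calls being abandoned: the request context is cancelled while the image copy is still running, and the "pull QPS exceeded" alongside them suggests the kubelet's registry pull rate limit is also biting with this many operators pulling at once. A small Go sketch of how a context cancellation surfaces in exactly that gRPC error shape; pullImage is a stand-in, not CRI-O's code:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// pullImage stands in for a CRI PullImage call: it "copies" until the
// context is cancelled, then reports the cancellation as a gRPC
// Canceled status -- the shape of the errors in the log above.
func pullImage(ctx context.Context) error {
	select {
	case <-time.After(10 * time.Second): // pretend the pull takes 10s
		return nil
	case <-ctx.Done():
		return status.Error(codes.Canceled, "copying config: "+ctx.Err().Error())
	}
}

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	go func() {
		time.Sleep(100 * time.Millisecond)
		cancel() // e.g. the caller gives up on the pull
	}()
	err := pullImage(ctx)
	fmt.Println(err)
	// rpc error: code = Canceled desc = copying config: context canceled
	fmt.Println(status.Code(err) == codes.Canceled) // true
}
```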
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.098476 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" event={"ID":"69215470-ec91-4d88-99f1-99117a543086","Type":"ContainerStarted","Data":"6a254d16f37fef872d7388fef70cebfa2b0d810736b5dba983069e0ff5114a34"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.100011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" event={"ID":"564e07f4-0673-42d8-a7ee-68366706b2d4","Type":"ContainerStarted","Data":"236984adaaf9804aca57ed4bf488e15fc8017dbaaf1c6c31ad370b133048fa55"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.101507 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" event={"ID":"588bb9d2-d747-43cb-8e9f-73d1961bebf1","Type":"ContainerStarted","Data":"b10a7f9a691edf826e18c97d0ee44330a48bcc25965db2ddeda67a0e0b618ab5"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.115506 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" event={"ID":"341519c2-107a-440a-bfbb-af937e0c681f","Type":"ContainerStarted","Data":"b3a02cd8f022dbc8eee5d5d80b4cf7a555e908f7f5a06ef5d3fb5544eed1d444"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.117323 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" event={"ID":"8656f7db-cb1e-40fa-ba97-93a647f869ac","Type":"ContainerStarted","Data":"b5601429474ce8335800c44ef68f51e4a56c253d3b1bc81ca6d47ca52fae1196"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.118872 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" event={"ID":"a4ad9153-5de0-4bb5-a419-fe70e3099450","Type":"ContainerStarted","Data":"2e0068410d3e75d792bd5bdc277ee37994e0be47ff222e1993bcc869c6b9963b"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.120221 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" event={"ID":"15e7fd30-4c0d-45f6-8905-ab235fc32e16","Type":"ContainerStarted","Data":"600b7f33f6f2ffcb3f958366da25aa8e6a5ebbfe4b6516fe5d81a520947e855a"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.121470 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" event={"ID":"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434","Type":"ContainerStarted","Data":"2b653e474c0a2ee3375fa3f2ede6f8c0cf6e009f703ff97589b794c9d125b172"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.123502 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d"} Dec 05 20:28:26 crc kubenswrapper[4744]: I1205 20:28:26.124738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" 
event={"ID":"807989ca-0470-47bc-8bef-9c1dd35e4bb0","Type":"ContainerStarted","Data":"793b96414647270151015c9efb64a5f22386ad3020e711f51045de924619c7c8"} Dec 05 20:28:29 crc kubenswrapper[4744]: E1205 20:28:29.019094 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 20:28:29 crc kubenswrapper[4744]: E1205 20:28:29.020069 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbs2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-v879l_openstack-operators(5e4a6d16-0c89-4bd1-aa53-ce798baff113): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:29 crc kubenswrapper[4744]: E1205 20:28:29.021532 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" podUID="5e4a6d16-0c89-4bd1-aa53-ce798baff113" Dec 05 20:28:44 crc kubenswrapper[4744]: E1205 20:28:44.226380 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 20:28:44 crc kubenswrapper[4744]: E1205 20:28:44.227100 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xq7z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dffqv_openstack-operators(0d2bbe6b-5adb-402a-8ef6-d7be819d5b73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:44 crc kubenswrapper[4744]: E1205 20:28:44.228724 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" podUID="0d2bbe6b-5adb-402a-8ef6-d7be819d5b73" Dec 05 20:28:53 crc kubenswrapper[4744]: E1205 20:28:53.505190 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7" Dec 05 20:28:53 crc kubenswrapper[4744]: E1205 20:28:53.505834 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42hqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-57548d458d-5bfxc_openstack-operators(341519c2-107a-440a-bfbb-af937e0c681f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4744]: E1205 20:28:53.844270 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 20:28:53 crc kubenswrapper[4744]: E1205 20:28:53.844492 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6phz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f6cb9b975-hx6rp_openstack-operators(9f0e37bb-46f6-45de-a562-ee1ce4d89c74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:28:53 crc kubenswrapper[4744]: E1205 20:28:53.845897 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.383016 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k"] Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.462650 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m"] Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.470607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" event={"ID":"63391739-cc08-49ea-be59-2c0740078450","Type":"ContainerStarted","Data":"d019daf342422f276f1495a05ea2b8844defe664b49b29fdbafc1ad8c015ea94"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.476860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" event={"ID":"6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74","Type":"ContainerStarted","Data":"e10f31d97a7dc9537cb5a9a73002d9fcff317e4c99d57a1ea8172691ddcef327"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.478053 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.480135 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.483053 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" event={"ID":"8656f7db-cb1e-40fa-ba97-93a647f869ac","Type":"ContainerStarted","Data":"65a49bd5a21f53e688357a4f2e91d3a3f4b65d3affcdc40004b38e631454f7ef"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.483898 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.485524 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.496882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" event={"ID":"41b48b2f-7b8d-46df-a226-6c163e4f57b0","Type":"ContainerStarted","Data":"2561e7a5adcbb4f5fdcba0c0c343ccad696dc3e6ef2ca54a0d06cc6acd02aa4e"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.510127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" event={"ID":"5e4a6d16-0c89-4bd1-aa53-ce798baff113","Type":"ContainerStarted","Data":"8ed4588b7ca95b854c412dde16a8728f4dcea7d06198a653535a7ada398a7543"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.511953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" 
event={"ID":"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73","Type":"ContainerStarted","Data":"52d48e8127e37af27e208bf67a1d101cae8dd0acef3ce622ca3c03d5ec77e7f8"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.513834 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mmzcc" podStartSLOduration=5.936210834 podStartE2EDuration="1m4.513814333s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:55.992442984 +0000 UTC m=+1046.222254352" lastFinishedPulling="2025-12-05 20:28:54.570046463 +0000 UTC m=+1104.799857851" observedRunningTime="2025-12-05 20:28:56.499410827 +0000 UTC m=+1106.729222195" watchObservedRunningTime="2025-12-05 20:28:56.513814333 +0000 UTC m=+1106.743625701" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.515158 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" event={"ID":"f3194afc-f21e-4fb0-bc31-5ac4b1b6e434","Type":"ContainerStarted","Data":"6483580f03cb564d5196115e501b95d8fc5f32c7b9a2efac5ca38dcc78cf440a"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.515320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.517020 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" event={"ID":"69215470-ec91-4d88-99f1-99117a543086","Type":"ContainerStarted","Data":"bb95449ced28797a8c19c518f34b725e5d4b8f6047420a69c118251c674e44d9"} Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.517075 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.517312 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.518770 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.531406 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lh9hc" podStartSLOduration=5.41856522 podStartE2EDuration="1m3.531391437s" podCreationTimestamp="2025-12-05 20:27:53 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.450401737 +0000 UTC m=+1046.680213105" lastFinishedPulling="2025-12-05 20:28:54.563227934 +0000 UTC m=+1104.793039322" observedRunningTime="2025-12-05 20:28:56.529307486 +0000 UTC m=+1106.759118854" watchObservedRunningTime="2025-12-05 20:28:56.531391437 +0000 UTC m=+1106.761202805" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.554033 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4dm28" podStartSLOduration=4.654915835 podStartE2EDuration="1m4.554015117s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.0619874 +0000 UTC m=+1046.291798768" lastFinishedPulling="2025-12-05 20:28:55.961086682 +0000 UTC m=+1106.190898050" observedRunningTime="2025-12-05 
20:28:56.552737995 +0000 UTC m=+1106.782549363" watchObservedRunningTime="2025-12-05 20:28:56.554015117 +0000 UTC m=+1106.783826485" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.598046 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m8844" podStartSLOduration=4.652170864 podStartE2EDuration="1m4.598029364s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:55.973720227 +0000 UTC m=+1046.203531595" lastFinishedPulling="2025-12-05 20:28:55.919578717 +0000 UTC m=+1106.149390095" observedRunningTime="2025-12-05 20:28:56.587006782 +0000 UTC m=+1106.816818180" watchObservedRunningTime="2025-12-05 20:28:56.598029364 +0000 UTC m=+1106.827840732" Dec 05 20:28:56 crc kubenswrapper[4744]: I1205 20:28:56.615536 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-6pdfn" podStartSLOduration=5.103512605 podStartE2EDuration="1m4.615521307s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.520515886 +0000 UTC m=+1046.750327254" lastFinishedPulling="2025-12-05 20:28:56.032524588 +0000 UTC m=+1106.262335956" observedRunningTime="2025-12-05 20:28:56.612533482 +0000 UTC m=+1106.842344840" watchObservedRunningTime="2025-12-05 20:28:56.615521307 +0000 UTC m=+1106.845332675" Dec 05 20:28:56 crc kubenswrapper[4744]: E1205 20:28:56.883104 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" podUID="021ea569-b351-4d31-8080-75f5ec005daa" Dec 05 20:28:56 crc kubenswrapper[4744]: E1205 20:28:56.937133 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" podUID="4aae801a-e589-469a-b153-116744edc63b" Dec 05 20:28:57 crc kubenswrapper[4744]: E1205 20:28:57.028589 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" podUID="341519c2-107a-440a-bfbb-af937e0c681f" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.524916 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" event={"ID":"341519c2-107a-440a-bfbb-af937e0c681f","Type":"ContainerStarted","Data":"acdc3cf3923f557189c9af694fa6e643afbefe8b3acd3d4503ad6bf8e5d176f3"} Dec 05 20:28:57 crc kubenswrapper[4744]: E1205 20:28:57.528894 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" podUID="341519c2-107a-440a-bfbb-af937e0c681f" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.531933 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" event={"ID":"a4ad9153-5de0-4bb5-a419-fe70e3099450","Type":"ContainerStarted","Data":"451e5e8117d67e131e222975175ceae865f866d034d78098119c61eeb35d1530"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.532767 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.535446 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.537797 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" event={"ID":"3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f","Type":"ContainerStarted","Data":"a186685f42462321bd9e51e08fbb2203b034d34c3530862da92025043edabee2"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.538337 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.540960 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.552003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" event={"ID":"43979be1-9cc5-445f-b079-b4504355cce4","Type":"ContainerStarted","Data":"364deb56c5d264f8886f47905e7bcdba3d378b7d5051e9c6b5b956fbf4522f44"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.573491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" event={"ID":"15e7fd30-4c0d-45f6-8905-ab235fc32e16","Type":"ContainerStarted","Data":"c6de49bf92ca7357fd243be32d107b1982073d69f6da43f6ad677ca71dbcfd63"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.574320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.578171 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n4ljw" podStartSLOduration=6.144886498 podStartE2EDuration="1m5.578155368s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.570948908 +0000 UTC m=+1046.800760276" lastFinishedPulling="2025-12-05 20:28:56.004217778 +0000 UTC m=+1106.234029146" observedRunningTime="2025-12-05 20:28:57.577608815 +0000 UTC m=+1107.807420183" watchObservedRunningTime="2025-12-05 20:28:57.578155368 +0000 UTC m=+1107.807966736" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.589352 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.592532 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" 
event={"ID":"4aae801a-e589-469a-b153-116744edc63b","Type":"ContainerStarted","Data":"985a17599036c96fcdcc41573af49cc495acae80aee438a805afacb5cc23745c"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.650459 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rw45s" podStartSLOduration=5.091419181 podStartE2EDuration="1m5.650437975s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:55.401413882 +0000 UTC m=+1045.631225250" lastFinishedPulling="2025-12-05 20:28:55.960432676 +0000 UTC m=+1106.190244044" observedRunningTime="2025-12-05 20:28:57.648853626 +0000 UTC m=+1107.878664994" watchObservedRunningTime="2025-12-05 20:28:57.650437975 +0000 UTC m=+1107.880249343" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.669384 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.669619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" event={"ID":"b82be17d-c46f-4d8d-9264-d51d1b2ef12e","Type":"ContainerStarted","Data":"5bf6c6147ce58996e2dc880b291ca94190e29c460e19a7bfb78eb81400af3e05"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.683556 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.694332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" event={"ID":"2c3d0695-b544-47a5-ad85-36f8fd2f1dcb","Type":"ContainerStarted","Data":"2971973104aea299f5fd9aa9bd4292ff2e555128a12fec62e3d7eb4eeeb9d924"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.695465 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.709178 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.720660 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" event={"ID":"021ea569-b351-4d31-8080-75f5ec005daa","Type":"ContainerStarted","Data":"884d8b942cf7d3d5e0981707d75ed289060893249ce44f051d1f66f49adbee55"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.760931 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" event={"ID":"564e07f4-0673-42d8-a7ee-68366706b2d4","Type":"ContainerStarted","Data":"ade397963fce08c67ad02054b77f133697e78955f194ea77a51429633b06e93c"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.761388 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.762719 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-b4sx5" podStartSLOduration=6.08325226 podStartE2EDuration="1m5.7627076s" 
podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.238136967 +0000 UTC m=+1046.467948335" lastFinishedPulling="2025-12-05 20:28:55.917592307 +0000 UTC m=+1106.147403675" observedRunningTime="2025-12-05 20:28:57.731135689 +0000 UTC m=+1107.960947057" watchObservedRunningTime="2025-12-05 20:28:57.7627076 +0000 UTC m=+1107.992518968" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.763885 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.772520 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kj76m" podStartSLOduration=4.507407516 podStartE2EDuration="1m5.772494921s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:54.55409761 +0000 UTC m=+1044.783908978" lastFinishedPulling="2025-12-05 20:28:55.819185005 +0000 UTC m=+1106.048996383" observedRunningTime="2025-12-05 20:28:57.759921741 +0000 UTC m=+1107.989733119" watchObservedRunningTime="2025-12-05 20:28:57.772494921 +0000 UTC m=+1108.002306299" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.797829 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" event={"ID":"022c2e13-58dd-42d3-a3a4-91a3eb74e0b5","Type":"ContainerStarted","Data":"873689e0d64b6fdcd5ebb6921d754e507b1e65656496ad61efd29df2297100a2"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.798108 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.814849 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-kdhmb" podStartSLOduration=6.247176929 podStartE2EDuration="1m5.814826007s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.43657735 +0000 UTC m=+1046.666388718" lastFinishedPulling="2025-12-05 20:28:56.004226418 +0000 UTC m=+1106.234037796" observedRunningTime="2025-12-05 20:28:57.797879079 +0000 UTC m=+1108.027690447" watchObservedRunningTime="2025-12-05 20:28:57.814826007 +0000 UTC m=+1108.044637375" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.820997 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.831201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerStarted","Data":"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e"} Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.845777 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lbxdx" podStartSLOduration=6.345482698 podStartE2EDuration="1m5.845760782s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.56165244 +0000 UTC m=+1046.791463808" lastFinishedPulling="2025-12-05 20:28:56.061930524 +0000 UTC m=+1106.291741892" observedRunningTime="2025-12-05 
20:28:57.84078842 +0000 UTC m=+1108.070599788" watchObservedRunningTime="2025-12-05 20:28:57.845760782 +0000 UTC m=+1108.075572150" Dec 05 20:28:57 crc kubenswrapper[4744]: I1205 20:28:57.883641 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-879df" podStartSLOduration=5.939650188 podStartE2EDuration="1m5.883620428s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.061624962 +0000 UTC m=+1046.291436330" lastFinishedPulling="2025-12-05 20:28:56.005595202 +0000 UTC m=+1106.235406570" observedRunningTime="2025-12-05 20:28:57.875797535 +0000 UTC m=+1108.105608903" watchObservedRunningTime="2025-12-05 20:28:57.883620428 +0000 UTC m=+1108.113431796" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.840562 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" event={"ID":"0d2bbe6b-5adb-402a-8ef6-d7be819d5b73","Type":"ContainerStarted","Data":"f71a0743a9a1cc053f3d76bf1e2cf1c1e1611481277596dca71e8d636f82b778"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.840875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.844621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" event={"ID":"4aae801a-e589-469a-b153-116744edc63b","Type":"ContainerStarted","Data":"810f0bab8d6061d4b2e01e943ec96a2fd42ead9667a988b43de06bf7f7753f34"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.844705 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.854708 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" event={"ID":"e038d15c-67e9-4551-b13b-c541b4b76827","Type":"ContainerStarted","Data":"55d589cb53b71f5657738a608c8d233bd3c4cddf04b5524562cb5aef819a22ec"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.854927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.857028 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.861471 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" podStartSLOduration=7.030621323 podStartE2EDuration="1m6.861456845s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.06469392 +0000 UTC m=+1046.294505288" lastFinishedPulling="2025-12-05 20:28:55.895529442 +0000 UTC m=+1106.125340810" observedRunningTime="2025-12-05 20:28:58.856267997 +0000 UTC m=+1109.086079365" watchObservedRunningTime="2025-12-05 20:28:58.861456845 +0000 UTC m=+1109.091268213" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.861707 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" 
event={"ID":"021ea569-b351-4d31-8080-75f5ec005daa","Type":"ContainerStarted","Data":"a6b217f0b3984ca453462c14b99bc7ceb5d7600440657575e46dba9f2fca4d0b"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.862450 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.865194 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" event={"ID":"5e4a6d16-0c89-4bd1-aa53-ce798baff113","Type":"ContainerStarted","Data":"6d02232faae796f8a4eba3ef326f48fc52e9f90584a3382ed07ba9a54ed96871"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.865844 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.872948 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" event={"ID":"807989ca-0470-47bc-8bef-9c1dd35e4bb0","Type":"ContainerStarted","Data":"2a7f3213a8ce3da9b1d16937988591a59ddb5f24211163176ecd98f8aff69c6a"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.874734 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.876234 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.877492 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerStarted","Data":"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.877744 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.886235 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8pp8k" podStartSLOduration=6.272731121 podStartE2EDuration="1m6.886218197s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:55.346700955 +0000 UTC m=+1045.576512313" lastFinishedPulling="2025-12-05 20:28:55.960188021 +0000 UTC m=+1106.189999389" observedRunningTime="2025-12-05 20:28:58.881531171 +0000 UTC m=+1109.111342559" watchObservedRunningTime="2025-12-05 20:28:58.886218197 +0000 UTC m=+1109.116029565" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.886785 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" event={"ID":"63391739-cc08-49ea-be59-2c0740078450","Type":"ContainerStarted","Data":"39d94b89c563ab754dbf91b579c32f3e1cffd8d464af7a38ba08184323a355d8"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.887460 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 
20:28:58.896667 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" event={"ID":"588bb9d2-d747-43cb-8e9f-73d1961bebf1","Type":"ContainerStarted","Data":"fc02e8dd4089142feeb7be3d18616bd7e1efb409efd8f84ddcaff73d45af4d56"} Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.898669 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:28:58 crc kubenswrapper[4744]: E1205 20:28:58.902512 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" podUID="341519c2-107a-440a-bfbb-af937e0c681f" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.903086 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.925160 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" podStartSLOduration=5.012419233 podStartE2EDuration="1m6.925141119s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.238619408 +0000 UTC m=+1046.468430786" lastFinishedPulling="2025-12-05 20:28:58.151341304 +0000 UTC m=+1108.381152672" observedRunningTime="2025-12-05 20:28:58.918997887 +0000 UTC m=+1109.148809265" watchObservedRunningTime="2025-12-05 20:28:58.925141119 +0000 UTC m=+1109.154952487" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.944038 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4b9p2" podStartSLOduration=7.046718653 podStartE2EDuration="1m6.944021536s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.108352711 +0000 UTC m=+1046.338164079" lastFinishedPulling="2025-12-05 20:28:56.005655594 +0000 UTC m=+1106.235466962" observedRunningTime="2025-12-05 20:28:58.940526589 +0000 UTC m=+1109.170337957" watchObservedRunningTime="2025-12-05 20:28:58.944021536 +0000 UTC m=+1109.173832904" Dec 05 20:28:58 crc kubenswrapper[4744]: I1205 20:28:58.990377 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" podStartSLOduration=65.990353431 podStartE2EDuration="1m5.990353431s" podCreationTimestamp="2025-12-05 20:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:28:58.979985514 +0000 UTC m=+1109.209796882" watchObservedRunningTime="2025-12-05 20:28:58.990353431 +0000 UTC m=+1109.220164799" Dec 05 20:28:59 crc kubenswrapper[4744]: I1205 20:28:59.023044 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" podStartSLOduration=7.110114574 podStartE2EDuration="1m7.023025098s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.118663031 +0000 UTC m=+1046.348474399" 
lastFinishedPulling="2025-12-05 20:28:56.031573555 +0000 UTC m=+1106.261384923" observedRunningTime="2025-12-05 20:28:59.008380707 +0000 UTC m=+1109.238192095" watchObservedRunningTime="2025-12-05 20:28:59.023025098 +0000 UTC m=+1109.252836466" Dec 05 20:28:59 crc kubenswrapper[4744]: I1205 20:28:59.103973 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" podStartSLOduration=7.671896382 podStartE2EDuration="1m7.103956959s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.573031793 +0000 UTC m=+1046.802843161" lastFinishedPulling="2025-12-05 20:28:56.00509237 +0000 UTC m=+1106.234903738" observedRunningTime="2025-12-05 20:28:59.100539614 +0000 UTC m=+1109.330350992" watchObservedRunningTime="2025-12-05 20:28:59.103956959 +0000 UTC m=+1109.333768327" Dec 05 20:28:59 crc kubenswrapper[4744]: I1205 20:28:59.127257 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-qqjj8" podStartSLOduration=7.673875939 podStartE2EDuration="1m7.127232363s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.541520883 +0000 UTC m=+1046.771332251" lastFinishedPulling="2025-12-05 20:28:55.994877307 +0000 UTC m=+1106.224688675" observedRunningTime="2025-12-05 20:28:59.117595546 +0000 UTC m=+1109.347406914" watchObservedRunningTime="2025-12-05 20:28:59.127232363 +0000 UTC m=+1109.357043741" Dec 05 20:28:59 crc kubenswrapper[4744]: I1205 20:28:59.143635 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" podStartSLOduration=5.081213793 podStartE2EDuration="1m7.143615458s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:27:56.258617183 +0000 UTC m=+1046.488428561" lastFinishedPulling="2025-12-05 20:28:58.321018858 +0000 UTC m=+1108.550830226" observedRunningTime="2025-12-05 20:28:59.140380779 +0000 UTC m=+1109.370192157" watchObservedRunningTime="2025-12-05 20:28:59.143615458 +0000 UTC m=+1109.373426826" Dec 05 20:29:01 crc kubenswrapper[4744]: I1205 20:29:01.921633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" event={"ID":"43979be1-9cc5-445f-b079-b4504355cce4","Type":"ContainerStarted","Data":"7c8a2da59532e4e3071c84e0a5d275595a518579a08e56882bd92004c5505da6"} Dec 05 20:29:01 crc kubenswrapper[4744]: I1205 20:29:01.922012 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:29:01 crc kubenswrapper[4744]: I1205 20:29:01.922043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" event={"ID":"43979be1-9cc5-445f-b079-b4504355cce4","Type":"ContainerStarted","Data":"5b6c1468d704a7b3081049a8ad1c82722db963e8b4e6b60f4286a51c65245cfa"} Dec 05 20:29:01 crc kubenswrapper[4744]: I1205 20:29:01.952953 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" podStartSLOduration=65.872697569 podStartE2EDuration="1m9.952932942s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 
20:28:56.677451587 +0000 UTC m=+1106.907262955" lastFinishedPulling="2025-12-05 20:29:00.75768695 +0000 UTC m=+1110.987498328" observedRunningTime="2025-12-05 20:29:01.947106627 +0000 UTC m=+1112.176918005" watchObservedRunningTime="2025-12-05 20:29:01.952932942 +0000 UTC m=+1112.182744310" Dec 05 20:29:02 crc kubenswrapper[4744]: I1205 20:29:02.990142 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dffqv" Dec 05 20:29:03 crc kubenswrapper[4744]: I1205 20:29:03.013219 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-57gjb" Dec 05 20:29:03 crc kubenswrapper[4744]: I1205 20:29:03.586399 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-9t8ch" Dec 05 20:29:03 crc kubenswrapper[4744]: I1205 20:29:03.598814 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-v879l" Dec 05 20:29:03 crc kubenswrapper[4744]: I1205 20:29:03.633563 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:29:05 crc kubenswrapper[4744]: I1205 20:29:05.769578 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dc867b75-npt5k" Dec 05 20:29:05 crc kubenswrapper[4744]: I1205 20:29:05.815334 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m" Dec 05 20:29:15 crc kubenswrapper[4744]: I1205 20:29:15.034104 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" event={"ID":"341519c2-107a-440a-bfbb-af937e0c681f","Type":"ContainerStarted","Data":"f43368b117d70cb44d2da3ac30003b2c6002b4cc2bdc98d3f3ff44ae0337a4d6"} Dec 05 20:29:15 crc kubenswrapper[4744]: I1205 20:29:15.034936 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:29:15 crc kubenswrapper[4744]: I1205 20:29:15.051667 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" podStartSLOduration=33.727450612 podStartE2EDuration="1m23.05164586s" podCreationTimestamp="2025-12-05 20:27:52 +0000 UTC" firstStartedPulling="2025-12-05 20:28:25.259387464 +0000 UTC m=+1075.489198832" lastFinishedPulling="2025-12-05 20:29:14.583582702 +0000 UTC m=+1124.813394080" observedRunningTime="2025-12-05 20:29:15.048856252 +0000 UTC m=+1125.278667620" watchObservedRunningTime="2025-12-05 20:29:15.05164586 +0000 UTC m=+1125.281457228" Dec 05 20:29:24 crc kubenswrapper[4744]: I1205 20:29:24.743687 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-5bfxc" Dec 05 20:29:29 crc kubenswrapper[4744]: I1205 20:29:29.897704 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"] Dec 05 20:29:29 crc kubenswrapper[4744]: I1205 20:29:29.898183 4744 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" podUID="188e3fd8-70e3-485f-8c79-3f47c9a88474" containerName="operator" containerID="cri-o://7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57" gracePeriod=10 Dec 05 20:29:30 crc kubenswrapper[4744]: I1205 20:29:30.860444 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" Dec 05 20:29:30 crc kubenswrapper[4744]: I1205 20:29:30.924577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xvl\" (UniqueName: \"kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl\") pod \"188e3fd8-70e3-485f-8c79-3f47c9a88474\" (UID: \"188e3fd8-70e3-485f-8c79-3f47c9a88474\") " Dec 05 20:29:30 crc kubenswrapper[4744]: I1205 20:29:30.934483 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl" (OuterVolumeSpecName: "kube-api-access-78xvl") pod "188e3fd8-70e3-485f-8c79-3f47c9a88474" (UID: "188e3fd8-70e3-485f-8c79-3f47c9a88474"). InnerVolumeSpecName "kube-api-access-78xvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.025884 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78xvl\" (UniqueName: \"kubernetes.io/projected/188e3fd8-70e3-485f-8c79-3f47c9a88474-kube-api-access-78xvl\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.170818 4744 generic.go:334] "Generic (PLEG): container finished" podID="188e3fd8-70e3-485f-8c79-3f47c9a88474" containerID="7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57" exitCode=0 Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.170869 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" event={"ID":"188e3fd8-70e3-485f-8c79-3f47c9a88474","Type":"ContainerDied","Data":"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57"} Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.170902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" event={"ID":"188e3fd8-70e3-485f-8c79-3f47c9a88474","Type":"ContainerDied","Data":"a9db3de2b622b2fad6590028a92349372d7fef93cc00b2f8acdb1c9cdf89f903"} Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.170926 4744 scope.go:117] "RemoveContainer" containerID="7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.171046 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.210122 4744 scope.go:117] "RemoveContainer" containerID="7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57" Dec 05 20:29:31 crc kubenswrapper[4744]: E1205 20:29:31.210648 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57\": container with ID starting with 7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57 not found: ID does not exist" containerID="7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.210682 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57"} err="failed to get container status \"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57\": rpc error: code = NotFound desc = could not find container \"7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57\": container with ID starting with 7c262d47f923b5aacf87c3581e82912b3ee5eb445d12455638f6233a78ab5e57 not found: ID does not exist" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.217729 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"] Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.225206 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56699b584c-kpnbl"] Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.369640 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.369899 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="manager" containerID="cri-o://53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" gracePeriod=10 Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.370044 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="kube-rbac-proxy" containerID="cri-o://04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" gracePeriod=10 Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.851905 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.938809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6phz\" (UniqueName: \"kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz\") pod \"9f0e37bb-46f6-45de-a562-ee1ce4d89c74\" (UID: \"9f0e37bb-46f6-45de-a562-ee1ce4d89c74\") " Dec 05 20:29:31 crc kubenswrapper[4744]: I1205 20:29:31.958035 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz" (OuterVolumeSpecName: "kube-api-access-p6phz") pod "9f0e37bb-46f6-45de-a562-ee1ce4d89c74" (UID: "9f0e37bb-46f6-45de-a562-ee1ce4d89c74"). InnerVolumeSpecName "kube-api-access-p6phz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.040475 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6phz\" (UniqueName: \"kubernetes.io/projected/9f0e37bb-46f6-45de-a562-ee1ce4d89c74-kube-api-access-p6phz\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.093870 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188e3fd8-70e3-485f-8c79-3f47c9a88474" path="/var/lib/kubelet/pods/188e3fd8-70e3-485f-8c79-3f47c9a88474/volumes" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182701 4744 generic.go:334] "Generic (PLEG): container finished" podID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerID="04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" exitCode=0 Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182764 4744 generic.go:334] "Generic (PLEG): container finished" podID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerID="53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" exitCode=0 Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182779 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerDied","Data":"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8"} Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182847 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerDied","Data":"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e"} Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182862 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp" event={"ID":"9f0e37bb-46f6-45de-a562-ee1ce4d89c74","Type":"ContainerDied","Data":"f89d6968db54e8e8354ea65430841272db1bb0b5c30c786f02f3f8a999061a1f"} Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.182907 4744 scope.go:117] "RemoveContainer" containerID="04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.205263 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.208027 4744 scope.go:117] "RemoveContainer" containerID="53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.213784 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f6cb9b975-hx6rp"] Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.222774 4744 scope.go:117] "RemoveContainer" containerID="04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" Dec 05 20:29:32 crc kubenswrapper[4744]: E1205 20:29:32.223267 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8\": container with ID starting with 04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8 not found: ID does not exist" containerID="04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223311 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8"} err="failed to get container status \"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8\": rpc error: code = NotFound desc = could not find container \"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8\": container with ID starting with 04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8 not found: ID does not exist" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223331 4744 scope.go:117] "RemoveContainer" containerID="53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" Dec 05 20:29:32 crc kubenswrapper[4744]: E1205 20:29:32.223564 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e\": container with ID starting with 53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e not found: ID does not exist" containerID="53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223593 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e"} err="failed to get container status \"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e\": rpc error: code = NotFound desc = could not find container \"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e\": container with ID starting with 53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e not found: ID does not exist" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223611 4744 scope.go:117] "RemoveContainer" containerID="04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223860 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8"} err="failed to get container status \"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8\": rpc error: code = NotFound desc = could not find container \"04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8\": container with ID starting with 04996dba1e2bedb5d76ca650f7b9c4e0d59ade83335a7042e20c8545b86adba8 not found: ID does not exist" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.223883 4744 scope.go:117] "RemoveContainer" containerID="53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e" Dec 05 20:29:32 crc kubenswrapper[4744]: I1205 20:29:32.224247 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e"} err="failed to get container status \"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e\": rpc error: code = NotFound desc = could not find container \"53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e\": container with ID starting with 53f5595543ff51ca97b9cdd281e79762c703037cce4c8abb003dafb3292de59e not found: ID does not exist" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.094804 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" path="/var/lib/kubelet/pods/9f0e37bb-46f6-45de-a562-ee1ce4d89c74/volumes" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465368 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:34 crc kubenswrapper[4744]: E1205 20:29:34.465635 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="kube-rbac-proxy" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465651 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="kube-rbac-proxy" Dec 05 20:29:34 crc kubenswrapper[4744]: E1205 20:29:34.465684 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188e3fd8-70e3-485f-8c79-3f47c9a88474" containerName="operator" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465690 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="188e3fd8-70e3-485f-8c79-3f47c9a88474" containerName="operator" Dec 05 20:29:34 crc kubenswrapper[4744]: E1205 20:29:34.465714 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="manager" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465720 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="manager" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465847 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="manager" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465864 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0e37bb-46f6-45de-a562-ee1ce4d89c74" containerName="kube-rbac-proxy" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.465877 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="188e3fd8-70e3-485f-8c79-3f47c9a88474" containerName="operator" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.466330 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.468205 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-pb7dj" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.476698 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.577741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862zw\" (UniqueName: \"kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw\") pod \"watcher-operator-index-wm7qj\" (UID: \"680917e5-3185-4d6e-be14-d98398d65fde\") " pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.679671 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862zw\" (UniqueName: \"kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw\") pod \"watcher-operator-index-wm7qj\" (UID: \"680917e5-3185-4d6e-be14-d98398d65fde\") " pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.719625 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862zw\" (UniqueName: \"kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw\") pod \"watcher-operator-index-wm7qj\" (UID: \"680917e5-3185-4d6e-be14-d98398d65fde\") " pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:34 crc kubenswrapper[4744]: I1205 20:29:34.785036 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:35 crc kubenswrapper[4744]: I1205 20:29:35.257849 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:36 crc kubenswrapper[4744]: I1205 20:29:36.220023 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-wm7qj" event={"ID":"680917e5-3185-4d6e-be14-d98398d65fde","Type":"ContainerStarted","Data":"1c8281e68727638aa2432ea5036799111be27549029a07ec0debf358ae312e95"} Dec 05 20:29:37 crc kubenswrapper[4744]: I1205 20:29:37.228859 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-wm7qj" event={"ID":"680917e5-3185-4d6e-be14-d98398d65fde","Type":"ContainerStarted","Data":"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017"} Dec 05 20:29:37 crc kubenswrapper[4744]: I1205 20:29:37.244175 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-wm7qj" podStartSLOduration=2.475002011 podStartE2EDuration="3.244158502s" podCreationTimestamp="2025-12-05 20:29:34 +0000 UTC" firstStartedPulling="2025-12-05 20:29:35.27565608 +0000 UTC m=+1145.505467458" lastFinishedPulling="2025-12-05 20:29:36.044812581 +0000 UTC m=+1146.274623949" observedRunningTime="2025-12-05 20:29:37.241888225 +0000 UTC m=+1147.471699603" watchObservedRunningTime="2025-12-05 20:29:37.244158502 +0000 UTC m=+1147.473969880" Dec 05 20:29:38 crc kubenswrapper[4744]: I1205 20:29:38.856112 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.245113 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-index-wm7qj" podUID="680917e5-3185-4d6e-be14-d98398d65fde" containerName="registry-server" containerID="cri-o://c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017" gracePeriod=2 Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.469220 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-t4s2v"] Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.470725 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.485193 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-t4s2v"] Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.546675 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r89wp\" (UniqueName: \"kubernetes.io/projected/3018fddf-6d8b-437d-9a8b-dd585664b159-kube-api-access-r89wp\") pod \"watcher-operator-index-t4s2v\" (UID: \"3018fddf-6d8b-437d-9a8b-dd585664b159\") " pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.638665 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.648632 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r89wp\" (UniqueName: \"kubernetes.io/projected/3018fddf-6d8b-437d-9a8b-dd585664b159-kube-api-access-r89wp\") pod \"watcher-operator-index-t4s2v\" (UID: \"3018fddf-6d8b-437d-9a8b-dd585664b159\") " pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.671932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r89wp\" (UniqueName: \"kubernetes.io/projected/3018fddf-6d8b-437d-9a8b-dd585664b159-kube-api-access-r89wp\") pod \"watcher-operator-index-t4s2v\" (UID: \"3018fddf-6d8b-437d-9a8b-dd585664b159\") " pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.750022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-862zw\" (UniqueName: \"kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw\") pod \"680917e5-3185-4d6e-be14-d98398d65fde\" (UID: \"680917e5-3185-4d6e-be14-d98398d65fde\") " Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.754815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw" (OuterVolumeSpecName: "kube-api-access-862zw") pod "680917e5-3185-4d6e-be14-d98398d65fde" (UID: "680917e5-3185-4d6e-be14-d98398d65fde"). InnerVolumeSpecName "kube-api-access-862zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.802550 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:39 crc kubenswrapper[4744]: I1205 20:29:39.851783 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-862zw\" (UniqueName: \"kubernetes.io/projected/680917e5-3185-4d6e-be14-d98398d65fde-kube-api-access-862zw\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.060594 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-t4s2v"] Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.275126 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-t4s2v" event={"ID":"3018fddf-6d8b-437d-9a8b-dd585664b159","Type":"ContainerStarted","Data":"41e9dc46666c9ff20e0e4890b670bc8efe1948874c3e747016f5bdd4de70fa2e"} Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.279344 4744 generic.go:334] "Generic (PLEG): container finished" podID="680917e5-3185-4d6e-be14-d98398d65fde" containerID="c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017" exitCode=0 Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.279432 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-wm7qj" event={"ID":"680917e5-3185-4d6e-be14-d98398d65fde","Type":"ContainerDied","Data":"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017"} Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.279514 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-wm7qj" Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.279718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-wm7qj" event={"ID":"680917e5-3185-4d6e-be14-d98398d65fde","Type":"ContainerDied","Data":"1c8281e68727638aa2432ea5036799111be27549029a07ec0debf358ae312e95"} Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.279736 4744 scope.go:117] "RemoveContainer" containerID="c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017" Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.303810 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.309436 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-index-wm7qj"] Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.311525 4744 scope.go:117] "RemoveContainer" containerID="c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017" Dec 05 20:29:40 crc kubenswrapper[4744]: E1205 20:29:40.311920 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017\": container with ID starting with c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017 not found: ID does not exist" containerID="c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017" Dec 05 20:29:40 crc kubenswrapper[4744]: I1205 20:29:40.311951 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017"} err="failed to get container status \"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017\": rpc error: code = NotFound desc = could not find container \"c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017\": container with ID starting with c2fbc38b8755d78b952774dc094e05ae6242091cdcf0c05fd2e089ac2aed2017 not found: ID does not exist" Dec 05 20:29:41 crc kubenswrapper[4744]: I1205 20:29:41.291471 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-t4s2v" event={"ID":"3018fddf-6d8b-437d-9a8b-dd585664b159","Type":"ContainerStarted","Data":"1c669c54a027b5d487aff743611ecf06a779c44dc9ec0793659514694a44c157"} Dec 05 20:29:41 crc kubenswrapper[4744]: I1205 20:29:41.311068 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-t4s2v" podStartSLOduration=1.9502066770000002 podStartE2EDuration="2.311043665s" podCreationTimestamp="2025-12-05 20:29:39 +0000 UTC" firstStartedPulling="2025-12-05 20:29:40.070206678 +0000 UTC m=+1150.300018046" lastFinishedPulling="2025-12-05 20:29:40.431043666 +0000 UTC m=+1150.660855034" observedRunningTime="2025-12-05 20:29:41.309920707 +0000 UTC m=+1151.539732085" watchObservedRunningTime="2025-12-05 20:29:41.311043665 +0000 UTC m=+1151.540855043" Dec 05 20:29:42 crc kubenswrapper[4744]: I1205 20:29:42.092675 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680917e5-3185-4d6e-be14-d98398d65fde" path="/var/lib/kubelet/pods/680917e5-3185-4d6e-be14-d98398d65fde/volumes" Dec 05 20:29:49 crc kubenswrapper[4744]: I1205 20:29:49.803659 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:49 crc kubenswrapper[4744]: I1205 20:29:49.805877 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:49 crc kubenswrapper[4744]: I1205 20:29:49.877946 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:50 crc kubenswrapper[4744]: I1205 20:29:50.398461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-t4s2v" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.915638 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq"] Dec 05 20:29:52 crc kubenswrapper[4744]: E1205 20:29:52.916132 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680917e5-3185-4d6e-be14-d98398d65fde" containerName="registry-server" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.916143 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="680917e5-3185-4d6e-be14-d98398d65fde" containerName="registry-server" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.916271 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="680917e5-3185-4d6e-be14-d98398d65fde" containerName="registry-server" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.917210 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.920011 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w9k4r" Dec 05 20:29:52 crc kubenswrapper[4744]: I1205 20:29:52.924637 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq"] Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.037430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.037501 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.037592 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5dg\" (UniqueName: \"kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.138893 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.138982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5dg\" (UniqueName: \"kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.139079 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.139455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.139570 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.161430 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5dg\" (UniqueName: \"kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg\") pod \"d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.247538 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:53 crc kubenswrapper[4744]: I1205 20:29:53.695891 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq"] Dec 05 20:29:54 crc kubenswrapper[4744]: I1205 20:29:54.397992 4744 generic.go:334] "Generic (PLEG): container finished" podID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerID="420d1aee01dfa12f7ebc5354443c2576322ead1c62e9f2ec16b82a1de6fa9268" exitCode=0 Dec 05 20:29:54 crc kubenswrapper[4744]: I1205 20:29:54.398062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" event={"ID":"81b7a1b4-a09c-4c8c-841d-5a8d8deed699","Type":"ContainerDied","Data":"420d1aee01dfa12f7ebc5354443c2576322ead1c62e9f2ec16b82a1de6fa9268"} Dec 05 20:29:54 crc kubenswrapper[4744]: I1205 20:29:54.398138 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" event={"ID":"81b7a1b4-a09c-4c8c-841d-5a8d8deed699","Type":"ContainerStarted","Data":"c7d317572fe55ff19b190ee0f619b2322a2f000cd4ea01958b48bb6cdc2bacc8"} Dec 05 20:29:55 crc kubenswrapper[4744]: I1205 20:29:55.407096 4744 generic.go:334] "Generic (PLEG): container finished" podID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerID="2de915865c57b4d5bfffb20b84eaae4040bcc00ce3e5e63aac11f108b28afbc1" exitCode=0 Dec 05 20:29:55 crc kubenswrapper[4744]: I1205 20:29:55.407188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" event={"ID":"81b7a1b4-a09c-4c8c-841d-5a8d8deed699","Type":"ContainerDied","Data":"2de915865c57b4d5bfffb20b84eaae4040bcc00ce3e5e63aac11f108b28afbc1"} Dec 05 20:29:56 crc kubenswrapper[4744]: I1205 20:29:56.418189 4744 generic.go:334] "Generic (PLEG): container finished" podID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerID="6f760d6fb95c8eaafbf64d90332f27315bbd908869f0371bf541f2a0db58e336" exitCode=0 Dec 05 20:29:56 crc kubenswrapper[4744]: I1205 20:29:56.418335 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" event={"ID":"81b7a1b4-a09c-4c8c-841d-5a8d8deed699","Type":"ContainerDied","Data":"6f760d6fb95c8eaafbf64d90332f27315bbd908869f0371bf541f2a0db58e336"} Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.712996 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.805265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util\") pod \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.805668 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle\") pod \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.805774 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5dg\" (UniqueName: \"kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg\") pod \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\" (UID: \"81b7a1b4-a09c-4c8c-841d-5a8d8deed699\") " Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.806618 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle" (OuterVolumeSpecName: "bundle") pod "81b7a1b4-a09c-4c8c-841d-5a8d8deed699" (UID: "81b7a1b4-a09c-4c8c-841d-5a8d8deed699"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.810316 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg" (OuterVolumeSpecName: "kube-api-access-bs5dg") pod "81b7a1b4-a09c-4c8c-841d-5a8d8deed699" (UID: "81b7a1b4-a09c-4c8c-841d-5a8d8deed699"). InnerVolumeSpecName "kube-api-access-bs5dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.827261 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util" (OuterVolumeSpecName: "util") pod "81b7a1b4-a09c-4c8c-841d-5a8d8deed699" (UID: "81b7a1b4-a09c-4c8c-841d-5a8d8deed699"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.907171 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5dg\" (UniqueName: \"kubernetes.io/projected/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-kube-api-access-bs5dg\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.907211 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-util\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:57 crc kubenswrapper[4744]: I1205 20:29:57.907224 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81b7a1b4-a09c-4c8c-841d-5a8d8deed699-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:29:58 crc kubenswrapper[4744]: I1205 20:29:58.437713 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" event={"ID":"81b7a1b4-a09c-4c8c-841d-5a8d8deed699","Type":"ContainerDied","Data":"c7d317572fe55ff19b190ee0f619b2322a2f000cd4ea01958b48bb6cdc2bacc8"} Dec 05 20:29:58 crc kubenswrapper[4744]: I1205 20:29:58.437757 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7d317572fe55ff19b190ee0f619b2322a2f000cd4ea01958b48bb6cdc2bacc8" Dec 05 20:29:58 crc kubenswrapper[4744]: I1205 20:29:58.438263 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.134128 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz"] Dec 05 20:30:00 crc kubenswrapper[4744]: E1205 20:30:00.135154 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="extract" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.135171 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="extract" Dec 05 20:30:00 crc kubenswrapper[4744]: E1205 20:30:00.135197 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="util" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.135206 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="util" Dec 05 20:30:00 crc kubenswrapper[4744]: E1205 20:30:00.135233 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="pull" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.135241 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="pull" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.135453 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b7a1b4-a09c-4c8c-841d-5a8d8deed699" containerName="extract" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.136014 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.140974 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.141422 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.143974 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz"] Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.238739 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdt6\" (UniqueName: \"kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.238866 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.238900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.339882 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdt6\" (UniqueName: \"kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.339951 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.339973 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.341057 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume\") pod 
\"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.346393 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.362354 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdt6\" (UniqueName: \"kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6\") pod \"collect-profiles-29416110-spfdz\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.456154 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:00 crc kubenswrapper[4744]: I1205 20:30:00.889897 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz"] Dec 05 20:30:00 crc kubenswrapper[4744]: W1205 20:30:00.893794 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0723d2_a725_4d0a_b93f_562aef05d491.slice/crio-88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2 WatchSource:0}: Error finding container 88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2: Status 404 returned error can't find the container with id 88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2 Dec 05 20:30:01 crc kubenswrapper[4744]: I1205 20:30:01.461373 4744 generic.go:334] "Generic (PLEG): container finished" podID="1d0723d2-a725-4d0a-b93f-562aef05d491" containerID="3767248a72b9e297f57ca8745806d04bd047be46c429e1f727563fc502469f1c" exitCode=0 Dec 05 20:30:01 crc kubenswrapper[4744]: I1205 20:30:01.461440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" event={"ID":"1d0723d2-a725-4d0a-b93f-562aef05d491","Type":"ContainerDied","Data":"3767248a72b9e297f57ca8745806d04bd047be46c429e1f727563fc502469f1c"} Dec 05 20:30:01 crc kubenswrapper[4744]: I1205 20:30:01.461736 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" event={"ID":"1d0723d2-a725-4d0a-b93f-562aef05d491","Type":"ContainerStarted","Data":"88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2"} Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.742787 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.883766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume\") pod \"1d0723d2-a725-4d0a-b93f-562aef05d491\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.883840 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhdt6\" (UniqueName: \"kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6\") pod \"1d0723d2-a725-4d0a-b93f-562aef05d491\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.883902 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume\") pod \"1d0723d2-a725-4d0a-b93f-562aef05d491\" (UID: \"1d0723d2-a725-4d0a-b93f-562aef05d491\") " Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.884350 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d0723d2-a725-4d0a-b93f-562aef05d491" (UID: "1d0723d2-a725-4d0a-b93f-562aef05d491"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.890113 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6" (OuterVolumeSpecName: "kube-api-access-nhdt6") pod "1d0723d2-a725-4d0a-b93f-562aef05d491" (UID: "1d0723d2-a725-4d0a-b93f-562aef05d491"). InnerVolumeSpecName "kube-api-access-nhdt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.890115 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d0723d2-a725-4d0a-b93f-562aef05d491" (UID: "1d0723d2-a725-4d0a-b93f-562aef05d491"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.985773 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0723d2-a725-4d0a-b93f-562aef05d491-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.985824 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhdt6\" (UniqueName: \"kubernetes.io/projected/1d0723d2-a725-4d0a-b93f-562aef05d491-kube-api-access-nhdt6\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:02 crc kubenswrapper[4744]: I1205 20:30:02.985839 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0723d2-a725-4d0a-b93f-562aef05d491-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.476780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" event={"ID":"1d0723d2-a725-4d0a-b93f-562aef05d491","Type":"ContainerDied","Data":"88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2"} Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.476815 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88fb20f50aa78b63bc37150e0e35b547a2ea9cecf3eb97efdd2c449b7596dcb2" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.476862 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416110-spfdz" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.561690 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:03 crc kubenswrapper[4744]: E1205 20:30:03.562016 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0723d2-a725-4d0a-b93f-562aef05d491" containerName="collect-profiles" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.562034 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0723d2-a725-4d0a-b93f-562aef05d491" containerName="collect-profiles" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.562171 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0723d2-a725-4d0a-b93f-562aef05d491" containerName="collect-profiles" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.562640 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.565607 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.565607 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5ljb9" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.572755 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.694727 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.694784 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvmv9\" (UniqueName: \"kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.694998 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.796631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.796712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.796762 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvmv9\" (UniqueName: \"kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.803935 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.804022 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.819693 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvmv9\" (UniqueName: \"kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9\") pod \"watcher-operator-controller-manager-d8cd4495-tf7zj\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:03 crc kubenswrapper[4744]: I1205 20:30:03.879230 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:04 crc kubenswrapper[4744]: I1205 20:30:04.300573 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:04 crc kubenswrapper[4744]: W1205 20:30:04.307103 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod508527be_df29_4021_af61_ecd106e8ec8d.slice/crio-879fc5a75210426918b152d4d3a9173a62127ec384fe4bad9725c4f28b1a5bff WatchSource:0}: Error finding container 879fc5a75210426918b152d4d3a9173a62127ec384fe4bad9725c4f28b1a5bff: Status 404 returned error can't find the container with id 879fc5a75210426918b152d4d3a9173a62127ec384fe4bad9725c4f28b1a5bff Dec 05 20:30:04 crc kubenswrapper[4744]: I1205 20:30:04.484906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" event={"ID":"508527be-df29-4021-af61-ecd106e8ec8d","Type":"ContainerStarted","Data":"6096f9f06b861a26c5c2e7ef95d01618c68c6d9994bb3a0c406f1efc1a92f572"} Dec 05 20:30:04 crc kubenswrapper[4744]: I1205 20:30:04.485230 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" event={"ID":"508527be-df29-4021-af61-ecd106e8ec8d","Type":"ContainerStarted","Data":"879fc5a75210426918b152d4d3a9173a62127ec384fe4bad9725c4f28b1a5bff"} Dec 05 20:30:04 crc kubenswrapper[4744]: I1205 20:30:04.485257 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:04 crc kubenswrapper[4744]: I1205 20:30:04.510738 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" podStartSLOduration=1.510717117 podStartE2EDuration="1.510717117s" podCreationTimestamp="2025-12-05 20:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:30:04.503357696 +0000 UTC 
m=+1174.733169074" watchObservedRunningTime="2025-12-05 20:30:04.510717117 +0000 UTC m=+1174.740528495" Dec 05 20:30:13 crc kubenswrapper[4744]: I1205 20:30:13.884081 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.571637 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl"] Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.572554 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.589760 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl"] Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.667231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-apiservice-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.667282 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-webhook-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.667342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6x25\" (UniqueName: \"kubernetes.io/projected/ab409bd9-6116-4c63-b990-0bdf2214420a-kube-api-access-b6x25\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.768764 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-apiservice-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.768800 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-webhook-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.768833 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6x25\" (UniqueName: \"kubernetes.io/projected/ab409bd9-6116-4c63-b990-0bdf2214420a-kube-api-access-b6x25\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" 
(UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.774762 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-webhook-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.776027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab409bd9-6116-4c63-b990-0bdf2214420a-apiservice-cert\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.797505 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6x25\" (UniqueName: \"kubernetes.io/projected/ab409bd9-6116-4c63-b990-0bdf2214420a-kube-api-access-b6x25\") pod \"watcher-operator-controller-manager-6dd9866b7f-kbvtl\" (UID: \"ab409bd9-6116-4c63-b990-0bdf2214420a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:15 crc kubenswrapper[4744]: I1205 20:30:15.897988 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:16 crc kubenswrapper[4744]: I1205 20:30:16.363499 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl"] Dec 05 20:30:16 crc kubenswrapper[4744]: I1205 20:30:16.581700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" event={"ID":"ab409bd9-6116-4c63-b990-0bdf2214420a","Type":"ContainerStarted","Data":"38016a792f832fb51f15dc0e46db4d16db158fd312c5018dce4397dc26c7e23e"} Dec 05 20:30:16 crc kubenswrapper[4744]: I1205 20:30:16.581750 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" event={"ID":"ab409bd9-6116-4c63-b990-0bdf2214420a","Type":"ContainerStarted","Data":"2997ff9e6a3309acf069aa6c2f2ecb9b7210dcedfab42b5d34dfe2e03e2517b9"} Dec 05 20:30:16 crc kubenswrapper[4744]: I1205 20:30:16.581906 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:16 crc kubenswrapper[4744]: I1205 20:30:16.611644 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" podStartSLOduration=1.611622543 podStartE2EDuration="1.611622543s" podCreationTimestamp="2025-12-05 20:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:30:16.60506505 +0000 UTC m=+1186.834876438" watchObservedRunningTime="2025-12-05 20:30:16.611622543 +0000 UTC m=+1186.841433911" Dec 05 20:30:25 crc kubenswrapper[4744]: I1205 20:30:25.901820 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-6dd9866b7f-kbvtl" Dec 05 20:30:25 crc kubenswrapper[4744]: I1205 20:30:25.956916 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:25 crc kubenswrapper[4744]: I1205 20:30:25.957164 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" podUID="508527be-df29-4021-af61-ecd106e8ec8d" containerName="manager" containerID="cri-o://6096f9f06b861a26c5c2e7ef95d01618c68c6d9994bb3a0c406f1efc1a92f572" gracePeriod=10 Dec 05 20:30:29 crc kubenswrapper[4744]: I1205 20:30:29.704560 4744 generic.go:334] "Generic (PLEG): container finished" podID="508527be-df29-4021-af61-ecd106e8ec8d" containerID="6096f9f06b861a26c5c2e7ef95d01618c68c6d9994bb3a0c406f1efc1a92f572" exitCode=0 Dec 05 20:30:29 crc kubenswrapper[4744]: I1205 20:30:29.704669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" event={"ID":"508527be-df29-4021-af61-ecd106e8ec8d","Type":"ContainerDied","Data":"6096f9f06b861a26c5c2e7ef95d01618c68c6d9994bb3a0c406f1efc1a92f572"} Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.155260 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.273480 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvmv9\" (UniqueName: \"kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9\") pod \"508527be-df29-4021-af61-ecd106e8ec8d\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.273534 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert\") pod \"508527be-df29-4021-af61-ecd106e8ec8d\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.273551 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert\") pod \"508527be-df29-4021-af61-ecd106e8ec8d\" (UID: \"508527be-df29-4021-af61-ecd106e8ec8d\") " Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.278666 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "508527be-df29-4021-af61-ecd106e8ec8d" (UID: "508527be-df29-4021-af61-ecd106e8ec8d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.281229 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9" (OuterVolumeSpecName: "kube-api-access-zvmv9") pod "508527be-df29-4021-af61-ecd106e8ec8d" (UID: "508527be-df29-4021-af61-ecd106e8ec8d"). InnerVolumeSpecName "kube-api-access-zvmv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.301466 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "508527be-df29-4021-af61-ecd106e8ec8d" (UID: "508527be-df29-4021-af61-ecd106e8ec8d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.375260 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvmv9\" (UniqueName: \"kubernetes.io/projected/508527be-df29-4021-af61-ecd106e8ec8d-kube-api-access-zvmv9\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.375327 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.375337 4744 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/508527be-df29-4021-af61-ecd106e8ec8d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.712253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" event={"ID":"508527be-df29-4021-af61-ecd106e8ec8d","Type":"ContainerDied","Data":"879fc5a75210426918b152d4d3a9173a62127ec384fe4bad9725c4f28b1a5bff"} Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.712306 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.712330 4744 scope.go:117] "RemoveContainer" containerID="6096f9f06b861a26c5c2e7ef95d01618c68c6d9994bb3a0c406f1efc1a92f572" Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.742356 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:30 crc kubenswrapper[4744]: I1205 20:30:30.751868 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-d8cd4495-tf7zj"] Dec 05 20:30:32 crc kubenswrapper[4744]: I1205 20:30:32.098900 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="508527be-df29-4021-af61-ecd106e8ec8d" path="/var/lib/kubelet/pods/508527be-df29-4021-af61-ecd106e8ec8d/volumes" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.567514 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 20:30:38 crc kubenswrapper[4744]: E1205 20:30:38.568269 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508527be-df29-4021-af61-ecd106e8ec8d" containerName="manager" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.568281 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="508527be-df29-4021-af61-ecd106e8ec8d" containerName="manager" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.568430 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="508527be-df29-4021-af61-ecd106e8ec8d" containerName="manager" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.569110 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.570920 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.571984 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573065 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573478 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573614 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573627 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573719 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.573842 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-g2k9w" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.583340 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684233 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684301 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684332 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rj7q\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-kube-api-access-9rj7q\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684359 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684397 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684473 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684512 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684534 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.684572 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785521 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785632 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785654 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785735 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785773 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785830 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785903 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.785958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rj7q\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-kube-api-access-9rj7q\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.786367 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.786811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.788339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.788408 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.788727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.793203 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.793795 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 
20:30:38.793827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.794793 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.796310 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.796341 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cae039eab64d793f6a4b155d29ce5c78dee09b2ac22c2e6b435910b34e9c2c6d/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.818459 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rj7q\" (UniqueName: \"kubernetes.io/projected/c4dae229-7a1c-4eb8-8932-7fd75e348bb2-kube-api-access-9rj7q\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.826380 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4f53d6-0e0a-45f3-bddc-87904f0c4c19\") pod \"rabbitmq-notifications-server-0\" (UID: \"c4dae229-7a1c-4eb8-8932-7fd75e348bb2\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:38 crc kubenswrapper[4744]: I1205 20:30:38.892442 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.163520 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.164686 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.167888 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-m5n9w" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.168132 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.168282 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.169775 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.169917 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.171916 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.173374 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.186520 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294024 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294072 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294099 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294118 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294139 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc 
kubenswrapper[4744]: I1205 20:30:39.294166 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb456f7-66c1-4493-85d4-bae3322914f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294265 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb456f7-66c1-4493-85d4-bae3322914f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.294301 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8fp\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-kube-api-access-xz8fp\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.371633 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395088 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395137 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb456f7-66c1-4493-85d4-bae3322914f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395165 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8fp\" (UniqueName: 
\"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-kube-api-access-xz8fp\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395206 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395304 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395340 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb456f7-66c1-4493-85d4-bae3322914f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395413 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.395872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-plugins\") 
pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.396261 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.396400 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.396668 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.397700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb456f7-66c1-4493-85d4-bae3322914f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.399865 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.399903 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/deb62399953271e555216ed13b3b322a00837fafa905fe9544578a0354fb7db3/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.402192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.402916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.403807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb456f7-66c1-4493-85d4-bae3322914f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.403914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb456f7-66c1-4493-85d4-bae3322914f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.424874 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8fp\" (UniqueName: \"kubernetes.io/projected/cfb456f7-66c1-4493-85d4-bae3322914f9-kube-api-access-xz8fp\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.437444 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7599923e-95dc-4d77-8b79-84f0a96d5018\") pod \"rabbitmq-server-0\" (UID: \"cfb456f7-66c1-4493-85d4-bae3322914f9\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.491030 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.772607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"c4dae229-7a1c-4eb8-8932-7fd75e348bb2","Type":"ContainerStarted","Data":"7480e5c496e82215b25fbc54ebb38ac63564cdc54fd0728cc55fda7818effaad"} Dec 05 20:30:39 crc kubenswrapper[4744]: I1205 20:30:39.940412 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 05 20:30:39 crc kubenswrapper[4744]: W1205 20:30:39.949448 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb456f7_66c1_4493_85d4_bae3322914f9.slice/crio-56d9a1dfd3a4d11cfee1252274b5876066a663b765da5671b68cdf425ca7d9b7 WatchSource:0}: Error finding container 56d9a1dfd3a4d11cfee1252274b5876066a663b765da5671b68cdf425ca7d9b7: Status 404 returned error can't find the container with id 56d9a1dfd3a4d11cfee1252274b5876066a663b765da5671b68cdf425ca7d9b7 Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.576496 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.577948 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.579461 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.582191 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.583318 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-9zgw8" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.583937 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.587994 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.589499 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614418 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-kolla-config\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614492 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614510 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614537 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtcgp\" (UniqueName: \"kubernetes.io/projected/a8aefca6-22e2-4e40-9287-3e0fec292264-kube-api-access-vtcgp\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614728 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614790 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.614817 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-default\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716594 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-kolla-config\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716647 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716669 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcgp\" (UniqueName: \"kubernetes.io/projected/a8aefca6-22e2-4e40-9287-3e0fec292264-kube-api-access-vtcgp\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.716741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-default\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.718826 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.719340 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-kolla-config\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.719591 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-config-data-default\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.719643 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8aefca6-22e2-4e40-9287-3e0fec292264-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.727999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.733324 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.733366 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b75b55be179284ea404d26dc9ccf7e3198fb906528381f345d257e5b7d3b43e8/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.735544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtcgp\" (UniqueName: \"kubernetes.io/projected/a8aefca6-22e2-4e40-9287-3e0fec292264-kube-api-access-vtcgp\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.737423 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8aefca6-22e2-4e40-9287-3e0fec292264-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.771777 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aa0b487-aab4-49b6-a019-57324ba1c0cd\") pod \"openstack-galera-0\" (UID: \"a8aefca6-22e2-4e40-9287-3e0fec292264\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.786249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"cfb456f7-66c1-4493-85d4-bae3322914f9","Type":"ContainerStarted","Data":"56d9a1dfd3a4d11cfee1252274b5876066a663b765da5671b68cdf425ca7d9b7"} Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.862667 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.865676 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.867831 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-r7jzb" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.867879 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.868329 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.884586 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.919436 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7shd\" (UniqueName: \"kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.919491 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.919514 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.919663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.919865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:40 crc kubenswrapper[4744]: I1205 20:30:40.927858 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.021845 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.021905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.021982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.022053 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7shd\" (UniqueName: \"kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.022092 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.022895 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.023087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.028832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.035374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.064072 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7shd\" (UniqueName: 
\"kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd\") pod \"memcached-0\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.197091 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.286752 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.288381 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.291762 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-t499p" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.299911 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.326425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drmlg\" (UniqueName: \"kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg\") pod \"kube-state-metrics-0\" (UID: \"5b28c681-e337-45a6-b41f-37ce1c0cc03b\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.500593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drmlg\" (UniqueName: \"kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg\") pod \"kube-state-metrics-0\" (UID: \"5b28c681-e337-45a6-b41f-37ce1c0cc03b\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.537570 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmlg\" (UniqueName: \"kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg\") pod \"kube-state-metrics-0\" (UID: \"5b28c681-e337-45a6-b41f-37ce1c0cc03b\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.568689 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.647505 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:30:41 crc kubenswrapper[4744]: I1205 20:30:41.829467 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"a8aefca6-22e2-4e40-9287-3e0fec292264","Type":"ContainerStarted","Data":"68a6e3f4ee237d79894364b04d490967af0e1411c2b01847ebc9c142f9f340cf"} Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.232569 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.244115 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.255714 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.255945 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-cluster-tls-config" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.256131 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.256169 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-zvf9k" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.259355 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.261177 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.314814 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.419531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.419633 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.420509 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lkc\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-kube-api-access-r9lkc\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.420623 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.420748 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 
20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.420794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.420824 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521812 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521847 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lkc\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-kube-api-access-r9lkc\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521929 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521951 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.521970 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-config-out\") pod 
\"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.527713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.530527 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.536468 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/483c94a6-6fac-4036-8b54-d22abbf49164-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.537491 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.539484 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.542075 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/483c94a6-6fac-4036-8b54-d22abbf49164-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.545647 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lkc\" (UniqueName: \"kubernetes.io/projected/483c94a6-6fac-4036-8b54-d22abbf49164-kube-api-access-r9lkc\") pod \"alertmanager-metric-storage-0\" (UID: \"483c94a6-6fac-4036-8b54-d22abbf49164\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.579002 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.612066 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.613410 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.629042 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.629308 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-n6bd8" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.647784 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.682438 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.724239 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnfw\" (UniqueName: \"kubernetes.io/projected/498942c9-c035-4d3f-a38b-a05221dc46c3-kube-api-access-vfnfw\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.724307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498942c9-c035-4d3f-a38b-a05221dc46c3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.744589 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.746374 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.755007 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.755797 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.755950 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.760657 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.760834 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.760942 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-pzqcl" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.819360 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.827000 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnfw\" (UniqueName: \"kubernetes.io/projected/498942c9-c035-4d3f-a38b-a05221dc46c3-kube-api-access-vfnfw\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.827051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498942c9-c035-4d3f-a38b-a05221dc46c3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.835917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498942c9-c035-4d3f-a38b-a05221dc46c3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.843127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5b28c681-e337-45a6-b41f-37ce1c0cc03b","Type":"ContainerStarted","Data":"00f761df3a4d7b6f59687ddfd6fadf79018640b87d88b16d40db2f5615e824a3"} Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.859117 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnfw\" (UniqueName: \"kubernetes.io/projected/498942c9-c035-4d3f-a38b-a05221dc46c3-kube-api-access-vfnfw\") pod \"observability-ui-dashboards-7d5fb4cbfb-pn2kn\" (UID: \"498942c9-c035-4d3f-a38b-a05221dc46c3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.860526 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"541c0230-6b36-4415-b8c6-9307b6529783","Type":"ContainerStarted","Data":"e0218214aac5d2361ea00d88932191aca09246f4d875ea8365dfbba716ce5b38"} Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928814 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928860 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928893 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928914 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928946 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjlh\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928981 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.928997 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.929023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod 
\"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.957126 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bb4d99b4f-pt8fl"] Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.958281 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:42 crc kubenswrapper[4744]: I1205 20:30:42.973856 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb4d99b4f-pt8fl"] Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.002899 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034483 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034557 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034610 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjlh\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034798 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034825 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.034861 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.035703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.040725 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.040760 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6ddf1f7acec4066f68d4ea259a334af34aba683a37b9f2a47c0ede5bd328023f/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.046508 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.048679 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.048938 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.052817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.055432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.097985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjlh\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137473 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137518 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-trusted-ca-bundle\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137547 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-oauth-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-oauth-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137622 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137671 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-service-ca\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.137705 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6st\" (UniqueName: \"kubernetes.io/projected/6aeaf73c-f120-4389-9bfe-9dff96e919e3-kube-api-access-cm6st\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " 
pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.189327 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.239510 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-service-ca\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.239621 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6st\" (UniqueName: \"kubernetes.io/projected/6aeaf73c-f120-4389-9bfe-9dff96e919e3-kube-api-access-cm6st\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240025 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-service-ca\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240644 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-trusted-ca-bundle\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-oauth-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-oauth-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240782 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: 
\"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.240998 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-trusted-ca-bundle\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.241544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-oauth-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.241583 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.243382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-oauth-config\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.244325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6aeaf73c-f120-4389-9bfe-9dff96e919e3-console-serving-cert\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.258744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6st\" (UniqueName: \"kubernetes.io/projected/6aeaf73c-f120-4389-9bfe-9dff96e919e3-kube-api-access-cm6st\") pod \"console-7bb4d99b4f-pt8fl\" (UID: \"6aeaf73c-f120-4389-9bfe-9dff96e919e3\") " pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.334235 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.376947 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.400718 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 05 20:30:43 crc kubenswrapper[4744]: W1205 20:30:43.538668 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod483c94a6_6fac_4036_8b54_d22abbf49164.slice/crio-fe30a98e24374bba266152d896c9733fb4b1f5f508fa0250effbf6d6542cf2c3 WatchSource:0}: Error finding container fe30a98e24374bba266152d896c9733fb4b1f5f508fa0250effbf6d6542cf2c3: Status 404 returned error can't find the container with id fe30a98e24374bba266152d896c9733fb4b1f5f508fa0250effbf6d6542cf2c3 Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.659211 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn"] Dec 05 20:30:43 crc kubenswrapper[4744]: I1205 20:30:43.881081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"483c94a6-6fac-4036-8b54-d22abbf49164","Type":"ContainerStarted","Data":"fe30a98e24374bba266152d896c9733fb4b1f5f508fa0250effbf6d6542cf2c3"} Dec 05 20:30:44 crc kubenswrapper[4744]: I1205 20:30:44.128326 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb4d99b4f-pt8fl"] Dec 05 20:30:44 crc kubenswrapper[4744]: I1205 20:30:44.264726 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 20:30:44 crc kubenswrapper[4744]: I1205 20:30:44.916192 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" event={"ID":"498942c9-c035-4d3f-a38b-a05221dc46c3","Type":"ContainerStarted","Data":"3ca60c18dc23bca640a71f014e6a8e54b23b76b3253cb9ddc8d5db00f211dca6"} Dec 05 20:30:44 crc kubenswrapper[4744]: W1205 20:30:44.934274 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aeaf73c_f120_4389_9bfe_9dff96e919e3.slice/crio-86b1bb728d0650e2071d5c35a5416bc2bf81a6d1596cb0692cf87433e72b314e WatchSource:0}: Error finding container 86b1bb728d0650e2071d5c35a5416bc2bf81a6d1596cb0692cf87433e72b314e: Status 404 returned error can't find the container with id 86b1bb728d0650e2071d5c35a5416bc2bf81a6d1596cb0692cf87433e72b314e Dec 05 20:30:45 crc kubenswrapper[4744]: I1205 20:30:45.939843 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb4d99b4f-pt8fl" event={"ID":"6aeaf73c-f120-4389-9bfe-9dff96e919e3","Type":"ContainerStarted","Data":"86b1bb728d0650e2071d5c35a5416bc2bf81a6d1596cb0692cf87433e72b314e"} Dec 05 20:30:45 crc kubenswrapper[4744]: I1205 20:30:45.941827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerStarted","Data":"e11c5ce2c8c9f4600cf23a4808e3a9722efc5ace53fc0134a29cdf73bcf75428"} Dec 05 20:30:49 crc kubenswrapper[4744]: I1205 20:30:49.807069 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:30:49 
crc kubenswrapper[4744]: I1205 20:30:49.807637 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.681607 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.682257 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rj7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_watcher-kuttl-default(c4dae229-7a1c-4eb8-8932-7fd75e348bb2): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.683460 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="c4dae229-7a1c-4eb8-8932-7fd75e348bb2" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.695501 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.695717 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xz8fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_watcher-kuttl-default(cfb456f7-66c1-4493-85d4-bae3322914f9): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:30:55 crc kubenswrapper[4744]: E1205 20:30:55.697010 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="cfb456f7-66c1-4493-85d4-bae3322914f9" Dec 05 20:30:56 crc kubenswrapper[4744]: E1205 20:30:56.063371 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="c4dae229-7a1c-4eb8-8932-7fd75e348bb2" Dec 05 20:30:56 crc kubenswrapper[4744]: E1205 20:30:56.064125 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="cfb456f7-66c1-4493-85d4-bae3322914f9" Dec 05 20:30:57 crc kubenswrapper[4744]: E1205 20:30:57.866640 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 05 20:30:57 crc kubenswrapper[4744]: E1205 20:30:57.867061 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtcgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMes
sagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_watcher-kuttl-default(a8aefca6-22e2-4e40-9287-3e0fec292264): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:30:57 crc kubenswrapper[4744]: E1205 20:30:57.868263 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="a8aefca6-22e2-4e40-9287-3e0fec292264" Dec 05 20:30:58 crc kubenswrapper[4744]: E1205 20:30:58.094883 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="a8aefca6-22e2-4e40-9287-3e0fec292264" Dec 05 20:30:58 crc kubenswrapper[4744]: E1205 20:30:58.620711 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 05 20:30:58 crc kubenswrapper[4744]: E1205 20:30:58.620942 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nb5h65fh78h566h98hb9h665h78h67dhf4h649h6bh5dh67fh74h64bh595h678h5dfh57dh5ddh59fh589h85hddh5cfh7dh559h57bh676h4h694q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j7shd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Por
t:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_watcher-kuttl-default(541c0230-6b36-4415-b8c6-9307b6529783): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:30:58 crc kubenswrapper[4744]: E1205 20:30:58.622251 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/memcached-0" podUID="541c0230-6b36-4415-b8c6-9307b6529783" Dec 05 20:30:59 crc kubenswrapper[4744]: E1205 20:30:59.099600 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="watcher-kuttl-default/memcached-0" podUID="541c0230-6b36-4415-b8c6-9307b6529783" Dec 05 20:30:59 crc kubenswrapper[4744]: E1205 20:30:59.636504 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 20:30:59 crc kubenswrapper[4744]: E1205 20:30:59.636877 4744 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 20:30:59 crc kubenswrapper[4744]: E1205 20:30:59.637019 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=watcher-kuttl-default],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-drmlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_watcher-kuttl-default(5b28c681-e337-45a6-b41f-37ce1c0cc03b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:30:59 crc kubenswrapper[4744]: E1205 20:30:59.638346 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" Dec 05 20:31:00 crc kubenswrapper[4744]: I1205 20:31:00.106098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb4d99b4f-pt8fl" event={"ID":"6aeaf73c-f120-4389-9bfe-9dff96e919e3","Type":"ContainerStarted","Data":"3b043817d01bb29605d54f512c7254055327b6562655cd9a8a762bf86f483cb6"} Dec 05 20:31:00 crc kubenswrapper[4744]: I1205 20:31:00.108528 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" event={"ID":"498942c9-c035-4d3f-a38b-a05221dc46c3","Type":"ContainerStarted","Data":"f633d2d3065feff790ebe0a04b9319d4f070a4e7414c964f3cd67b70464bed4c"} Dec 05 20:31:00 crc kubenswrapper[4744]: E1205 20:31:00.109446 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" Dec 05 20:31:00 crc kubenswrapper[4744]: I1205 20:31:00.146145 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bb4d99b4f-pt8fl" podStartSLOduration=18.14612358 podStartE2EDuration="18.14612358s" podCreationTimestamp="2025-12-05 20:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:00.124921547 +0000 UTC m=+1230.354732935" watchObservedRunningTime="2025-12-05 20:31:00.14612358 +0000 UTC m=+1230.375934968" Dec 05 20:31:00 crc kubenswrapper[4744]: I1205 20:31:00.163581 4744 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-pn2kn" podStartSLOduration=2.852688762 podStartE2EDuration="18.163562951s" podCreationTimestamp="2025-12-05 20:30:42 +0000 UTC" firstStartedPulling="2025-12-05 20:30:43.923761844 +0000 UTC m=+1214.153573212" lastFinishedPulling="2025-12-05 20:30:59.234636033 +0000 UTC m=+1229.464447401" observedRunningTime="2025-12-05 20:31:00.158402324 +0000 UTC m=+1230.388213702" watchObservedRunningTime="2025-12-05 20:31:00.163562951 +0000 UTC m=+1230.393374319" Dec 05 20:31:03 crc kubenswrapper[4744]: I1205 20:31:03.134373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"483c94a6-6fac-4036-8b54-d22abbf49164","Type":"ContainerStarted","Data":"649d5a4540b5e4582f4161ba90e7cd4593e7a94c675d4ac4fb4291b9a065b52e"} Dec 05 20:31:03 crc kubenswrapper[4744]: I1205 20:31:03.136109 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerStarted","Data":"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c"} Dec 05 20:31:03 crc kubenswrapper[4744]: I1205 20:31:03.335456 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:31:03 crc kubenswrapper[4744]: I1205 20:31:03.335508 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:31:03 crc kubenswrapper[4744]: I1205 20:31:03.339672 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:31:04 crc kubenswrapper[4744]: I1205 20:31:04.147521 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bb4d99b4f-pt8fl" Dec 05 20:31:04 crc kubenswrapper[4744]: I1205 20:31:04.206311 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.192238 4744 generic.go:334] "Generic (PLEG): container finished" podID="483c94a6-6fac-4036-8b54-d22abbf49164" containerID="649d5a4540b5e4582f4161ba90e7cd4593e7a94c675d4ac4fb4291b9a065b52e" exitCode=0 Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.192642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"483c94a6-6fac-4036-8b54-d22abbf49164","Type":"ContainerDied","Data":"649d5a4540b5e4582f4161ba90e7cd4593e7a94c675d4ac4fb4291b9a065b52e"} Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.195663 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"c4dae229-7a1c-4eb8-8932-7fd75e348bb2","Type":"ContainerStarted","Data":"cc25a9e9d72d30ac5f2d89cc76735f430e59a33cd73780212408ec6dd28dc517"} Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.197377 4744 generic.go:334] "Generic (PLEG): container finished" podID="663dea5b-3cc7-4c28-9803-302e771b8556" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" exitCode=0 Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.197445 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" 
event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerDied","Data":"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c"} Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.198989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"a8aefca6-22e2-4e40-9287-3e0fec292264","Type":"ContainerStarted","Data":"029cdac82bedda88eeb5ee51851beb53c2cf28440b325fbf7fc43991043a6eb0"} Dec 05 20:31:10 crc kubenswrapper[4744]: I1205 20:31:10.201366 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"cfb456f7-66c1-4493-85d4-bae3322914f9","Type":"ContainerStarted","Data":"6b850b78d21075d7781357596ff60640cd71cc5ea27699baeb128222a1c29db5"} Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.235512 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"541c0230-6b36-4415-b8c6-9307b6529783","Type":"ContainerStarted","Data":"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e"} Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.237175 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.240687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5b28c681-e337-45a6-b41f-37ce1c0cc03b","Type":"ContainerStarted","Data":"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1"} Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.241150 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.254639 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.984445838 podStartE2EDuration="32.254623912s" podCreationTimestamp="2025-12-05 20:30:40 +0000 UTC" firstStartedPulling="2025-12-05 20:30:42.33024069 +0000 UTC m=+1212.560052058" lastFinishedPulling="2025-12-05 20:31:11.600418774 +0000 UTC m=+1241.830230132" observedRunningTime="2025-12-05 20:31:12.251383392 +0000 UTC m=+1242.481194790" watchObservedRunningTime="2025-12-05 20:31:12.254623912 +0000 UTC m=+1242.484435280" Dec 05 20:31:12 crc kubenswrapper[4744]: I1205 20:31:12.279026 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.400359027 podStartE2EDuration="31.279009074s" podCreationTimestamp="2025-12-05 20:30:41 +0000 UTC" firstStartedPulling="2025-12-05 20:30:42.727526079 +0000 UTC m=+1212.957337447" lastFinishedPulling="2025-12-05 20:31:11.606176126 +0000 UTC m=+1241.835987494" observedRunningTime="2025-12-05 20:31:12.266337351 +0000 UTC m=+1242.496148719" watchObservedRunningTime="2025-12-05 20:31:12.279009074 +0000 UTC m=+1242.508820442" Dec 05 20:31:16 crc kubenswrapper[4744]: I1205 20:31:16.199505 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Dec 05 20:31:16 crc kubenswrapper[4744]: I1205 20:31:16.273182 4744 generic.go:334] "Generic (PLEG): container finished" podID="a8aefca6-22e2-4e40-9287-3e0fec292264" containerID="029cdac82bedda88eeb5ee51851beb53c2cf28440b325fbf7fc43991043a6eb0" exitCode=0 Dec 05 20:31:16 crc kubenswrapper[4744]: I1205 20:31:16.273223 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"a8aefca6-22e2-4e40-9287-3e0fec292264","Type":"ContainerDied","Data":"029cdac82bedda88eeb5ee51851beb53c2cf28440b325fbf7fc43991043a6eb0"} Dec 05 20:31:19 crc kubenswrapper[4744]: I1205 20:31:19.806955 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:31:19 crc kubenswrapper[4744]: I1205 20:31:19.807225 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.312607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerStarted","Data":"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b"} Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.315114 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"a8aefca6-22e2-4e40-9287-3e0fec292264","Type":"ContainerStarted","Data":"6b7a0de1bc160b969a717d3d778b47c381053c061a863fb7481b59a45a33e3bf"} Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.318739 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"483c94a6-6fac-4036-8b54-d22abbf49164","Type":"ContainerStarted","Data":"5890d57de78efeb40f4e63ae8eded3f6df3e6b20fc7c3dc44c8d58306b512f3d"} Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.341898 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=13.388707168 podStartE2EDuration="41.341882652s" podCreationTimestamp="2025-12-05 20:30:39 +0000 UTC" firstStartedPulling="2025-12-05 20:30:41.577809404 +0000 UTC m=+1211.807620772" lastFinishedPulling="2025-12-05 20:31:09.530984888 +0000 UTC m=+1239.760796256" observedRunningTime="2025-12-05 20:31:20.339396901 +0000 UTC m=+1250.569208269" watchObservedRunningTime="2025-12-05 20:31:20.341882652 +0000 UTC m=+1250.571694020" Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.927996 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:31:20 crc kubenswrapper[4744]: I1205 20:31:20.928073 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:31:21 crc kubenswrapper[4744]: I1205 20:31:21.656443 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:31:23 crc kubenswrapper[4744]: I1205 20:31:23.343969 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"483c94a6-6fac-4036-8b54-d22abbf49164","Type":"ContainerStarted","Data":"cd092847096776b18c2e770d4b381e87ee7cb823b6390cb771a69e3b7e47f084"} Dec 05 20:31:23 crc kubenswrapper[4744]: I1205 20:31:23.344316 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:31:23 crc kubenswrapper[4744]: I1205 20:31:23.346086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerStarted","Data":"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a"} Dec 05 20:31:23 crc kubenswrapper[4744]: I1205 20:31:23.351700 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 05 20:31:23 crc kubenswrapper[4744]: I1205 20:31:23.375452 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=4.904103253 podStartE2EDuration="41.375427166s" podCreationTimestamp="2025-12-05 20:30:42 +0000 UTC" firstStartedPulling="2025-12-05 20:30:43.546958061 +0000 UTC m=+1213.776769429" lastFinishedPulling="2025-12-05 20:31:20.018281974 +0000 UTC m=+1250.248093342" observedRunningTime="2025-12-05 20:31:23.363741178 +0000 UTC m=+1253.593552546" watchObservedRunningTime="2025-12-05 20:31:23.375427166 +0000 UTC m=+1253.605238554" Dec 05 20:31:25 crc kubenswrapper[4744]: I1205 20:31:25.041131 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:31:25 crc kubenswrapper[4744]: I1205 20:31:25.150163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/openstack-galera-0" Dec 05 20:31:26 crc kubenswrapper[4744]: I1205 20:31:26.376139 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerStarted","Data":"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1"} Dec 05 20:31:28 crc kubenswrapper[4744]: I1205 20:31:28.377130 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:28 crc kubenswrapper[4744]: I1205 20:31:28.377537 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:28 crc kubenswrapper[4744]: I1205 20:31:28.379484 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:28 crc kubenswrapper[4744]: I1205 20:31:28.393044 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:28 crc kubenswrapper[4744]: I1205 20:31:28.416926 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=6.330671156 podStartE2EDuration="47.416902347s" podCreationTimestamp="2025-12-05 20:30:41 +0000 UTC" firstStartedPulling="2025-12-05 20:30:44.938235926 +0000 UTC m=+1215.168047294" lastFinishedPulling="2025-12-05 20:31:26.024467087 +0000 UTC m=+1256.254278485" observedRunningTime="2025-12-05 20:31:26.406127771 +0000 UTC m=+1256.635939139" watchObservedRunningTime="2025-12-05 20:31:28.416902347 +0000 UTC m=+1258.646713725" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.283834 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-d675d5484-zjxxm" 
podUID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" containerName="console" containerID="cri-o://fed1f2eaf2d53105dc8a5256cf9b31f5d69099d43244e82515aa6bfd6cfcb5f9" gracePeriod=15 Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.416906 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d675d5484-zjxxm_203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d/console/0.log" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.417223 4744 generic.go:334] "Generic (PLEG): container finished" podID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" containerID="fed1f2eaf2d53105dc8a5256cf9b31f5d69099d43244e82515aa6bfd6cfcb5f9" exitCode=2 Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.417261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d675d5484-zjxxm" event={"ID":"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d","Type":"ContainerDied","Data":"fed1f2eaf2d53105dc8a5256cf9b31f5d69099d43244e82515aa6bfd6cfcb5f9"} Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.833514 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d675d5484-zjxxm_203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d/console/0.log" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.833594 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.874883 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.875210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.875933 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config" (OuterVolumeSpecName: "console-config") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876357 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876762 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4b9\" (UniqueName: \"kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876796 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876817 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.876836 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert\") pod \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\" (UID: \"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d\") " Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.877218 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.879073 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.879370 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.879401 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.882595 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.883722 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca" (OuterVolumeSpecName: "service-ca") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.891487 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.912658 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9" (OuterVolumeSpecName: "kube-api-access-4l4b9") pod "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" (UID: "203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d"). InnerVolumeSpecName "kube-api-access-4l4b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.980286 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.980349 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.980365 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:29 crc kubenswrapper[4744]: I1205 20:31:29.980377 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4b9\" (UniqueName: \"kubernetes.io/projected/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d-kube-api-access-4l4b9\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.426109 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d675d5484-zjxxm_203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d/console/0.log" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.426219 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d675d5484-zjxxm" event={"ID":"203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d","Type":"ContainerDied","Data":"b0f89f9f5aac0ccf95aa4780e6b03abebe337025c88d1f7b5c6932b72ca3bd43"} Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.426253 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d675d5484-zjxxm" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.426265 4744 scope.go:117] "RemoveContainer" containerID="fed1f2eaf2d53105dc8a5256cf9b31f5d69099d43244e82515aa6bfd6cfcb5f9" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.452021 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.459192 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d675d5484-zjxxm"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.825515 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-d357-account-create-update-g66fz"] Dec 05 20:31:30 crc kubenswrapper[4744]: E1205 20:31:30.826872 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" containerName="console" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.826897 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" containerName="console" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.827139 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" containerName="console" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.827712 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.829622 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.835830 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-d357-account-create-update-g66fz"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.881698 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-c2ltm"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.883168 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.894249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.894439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljrh9\" (UniqueName: \"kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.894795 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-c2ltm"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.923085 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.996505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvfp\" (UniqueName: \"kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.996637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.996805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.996932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrh9\" (UniqueName: \"kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:30 crc kubenswrapper[4744]: I1205 20:31:30.997940 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.021107 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljrh9\" (UniqueName: \"kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9\") pod \"keystone-d357-account-create-update-g66fz\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.098578 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.099073 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvfp\" (UniqueName: \"kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.099702 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.114596 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvfp\" (UniqueName: \"kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp\") pod \"keystone-db-create-c2ltm\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.151654 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.205791 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.452323 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="prometheus" containerID="cri-o://7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" gracePeriod=600 Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.452938 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="thanos-sidecar" containerID="cri-o://08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" gracePeriod=600 Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.452978 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="config-reloader" containerID="cri-o://235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" gracePeriod=600 Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.708992 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-c2ltm"] Dec 05 20:31:31 crc kubenswrapper[4744]: W1205 20:31:31.711464 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5ce9df_fda1_446b_a224_9f4d1a93dc47.slice/crio-20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f WatchSource:0}: Error finding container 20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f: Status 404 returned error can't find the container with id 20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f Dec 05 20:31:31 crc kubenswrapper[4744]: I1205 20:31:31.787415 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-d357-account-create-update-g66fz"] Dec 05 20:31:31 crc kubenswrapper[4744]: W1205 20:31:31.840546 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadce1d89_2b99_4f7c_ba03_0fcc1bb8ea08.slice/crio-52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e WatchSource:0}: Error finding container 52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e: Status 404 returned error can't find the container with id 52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.091863 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d" path="/var/lib/kubelet/pods/203cad7e-b0d1-4c93-b217-d7cf0c6f1c5d/volumes" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.417520 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480215 4744 generic.go:334] "Generic (PLEG): container finished" podID="663dea5b-3cc7-4c28-9803-302e771b8556" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" exitCode=0 Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480259 4744 generic.go:334] "Generic (PLEG): container finished" podID="663dea5b-3cc7-4c28-9803-302e771b8556" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" exitCode=0 Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480270 4744 generic.go:334] "Generic (PLEG): container finished" podID="663dea5b-3cc7-4c28-9803-302e771b8556" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" exitCode=0 Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480357 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerDied","Data":"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480395 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerDied","Data":"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerDied","Data":"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480424 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"663dea5b-3cc7-4c28-9803-302e771b8556","Type":"ContainerDied","Data":"e11c5ce2c8c9f4600cf23a4808e3a9722efc5ace53fc0134a29cdf73bcf75428"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480444 4744 scope.go:117] "RemoveContainer" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.480600 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.492908 4744 generic.go:334] "Generic (PLEG): container finished" podID="cd5ce9df-fda1-446b-a224-9f4d1a93dc47" containerID="8cd93ac303e7672e0dd178a58a7e51b745bd68ebdfe10909dc99676f764a3aab" exitCode=0 Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.492987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-c2ltm" event={"ID":"cd5ce9df-fda1-446b-a224-9f4d1a93dc47","Type":"ContainerDied","Data":"8cd93ac303e7672e0dd178a58a7e51b745bd68ebdfe10909dc99676f764a3aab"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.493021 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-c2ltm" event={"ID":"cd5ce9df-fda1-446b-a224-9f4d1a93dc47","Type":"ContainerStarted","Data":"20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.494642 4744 generic.go:334] "Generic (PLEG): container finished" podID="adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" containerID="ad5bb2e5c902f8f334770fc8c38d4b6378ad921244c8f313bbc0747f18bd6fd5" exitCode=0 Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.494679 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" event={"ID":"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08","Type":"ContainerDied","Data":"ad5bb2e5c902f8f334770fc8c38d4b6378ad921244c8f313bbc0747f18bd6fd5"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.494723 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" event={"ID":"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08","Type":"ContainerStarted","Data":"52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e"} Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.517227 4744 scope.go:117] "RemoveContainer" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.530961 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531028 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpjlh\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531138 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc 
kubenswrapper[4744]: I1205 20:31:32.531227 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531263 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.531350 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file\") pod \"663dea5b-3cc7-4c28-9803-302e771b8556\" (UID: \"663dea5b-3cc7-4c28-9803-302e771b8556\") " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.533639 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.538206 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out" (OuterVolumeSpecName: "config-out") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.538611 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.538640 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh" (OuterVolumeSpecName: "kube-api-access-cpjlh") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "kube-api-access-cpjlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.538750 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.540351 4744 scope.go:117] "RemoveContainer" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.547784 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.555034 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config" (OuterVolumeSpecName: "config") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.561344 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config" (OuterVolumeSpecName: "web-config") pod "663dea5b-3cc7-4c28-9803-302e771b8556" (UID: "663dea5b-3cc7-4c28-9803-302e771b8556"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.610773 4744 scope.go:117] "RemoveContainer" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.629424 4744 scope.go:117] "RemoveContainer" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.629692 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": container with ID starting with 08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1 not found: ID does not exist" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.629716 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1"} err="failed to get container status \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": rpc error: code = NotFound desc = could not find container \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": container with ID starting with 08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1 not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.629737 4744 scope.go:117] "RemoveContainer" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.630035 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": container with ID starting with 235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a not found: ID does not exist" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630058 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a"} err="failed to get container status \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": rpc error: code = NotFound desc = could not find container \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": container with ID starting with 235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630072 4744 scope.go:117] "RemoveContainer" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.630382 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": container with ID starting with 7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b not found: ID does not exist" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630396 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b"} err="failed to get container status \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": rpc error: code = NotFound desc = could not find container \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": container with ID starting with 7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630408 4744 scope.go:117] "RemoveContainer" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.630618 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": container with ID starting with 9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c not found: ID does not exist" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630633 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c"} err="failed to get container status \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": rpc error: code = NotFound desc = could not find container \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": container with ID starting with 9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630644 4744 scope.go:117] "RemoveContainer" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630868 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1"} err="failed to get container status \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": rpc error: code = NotFound desc = could not find container \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": container with ID starting with 08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1 not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.630884 4744 scope.go:117] "RemoveContainer" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631096 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a"} err="failed to get container status \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": rpc error: code = NotFound desc = could not find container \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": container with ID starting with 235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631202 4744 scope.go:117] "RemoveContainer" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631711 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b"} err="failed to get container status \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": rpc error: code = NotFound desc = could not find container \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": container with ID starting with 7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631728 4744 scope.go:117] "RemoveContainer" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631978 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c"} err="failed to get container status \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": rpc error: code = NotFound desc = could not find container \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": container with ID starting with 9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.631999 4744 scope.go:117] "RemoveContainer" containerID="08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632310 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1"} err="failed to get container status \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": rpc error: code = NotFound desc = could not find container \"08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1\": container with ID starting with 08ba709d59f1ce5938e01137933e3ec21b397622f19bf934c3927eb9b9bd23b1 not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632363 4744 scope.go:117] "RemoveContainer" containerID="235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632620 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a"} err="failed to get container status \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": rpc error: code = NotFound desc = could not find container \"235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a\": container with ID starting with 235f56225a171dac572484accb6219ea78448cb54d06c4ce5b7f4a3d6d2b3a2a not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632636 4744 scope.go:117] "RemoveContainer" containerID="7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632823 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b"} err="failed to get container status \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": rpc error: code = NotFound desc = could not find container \"7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b\": container with ID starting with 7cdd6cb1818e0d14ff61d43681e13bb718c4373a8970d8f74b521fddabd9c64b not found: ID does not exist" Dec 
05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.632839 4744 scope.go:117] "RemoveContainer" containerID="9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633032 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c"} err="failed to get container status \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": rpc error: code = NotFound desc = could not find container \"9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c\": container with ID starting with 9d4e554daf7f85504a59e1d72eeefb2eb5c1a97d337a717f37a049a6fa5f6e5c not found: ID does not exist" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633282 4744 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633419 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") on node \"crc\" " Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633480 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpjlh\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-kube-api-access-cpjlh\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633536 4744 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/663dea5b-3cc7-4c28-9803-302e771b8556-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633601 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633659 4744 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/663dea5b-3cc7-4c28-9803-302e771b8556-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633717 4744 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/663dea5b-3cc7-4c28-9803-302e771b8556-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.633772 4744 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/663dea5b-3cc7-4c28-9803-302e771b8556-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.649411 4744 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.649567 4744 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c") on node "crc"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.735784 4744 reconciler_common.go:293] "Volume detached for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") on node \"crc\" DevicePath \"\""
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.817927 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.824394 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.846320 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.846786 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="thanos-sidecar"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.846804 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="thanos-sidecar"
Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.846881 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="prometheus"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.846892 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="prometheus"
Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.846917 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="init-config-reloader"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.846940 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="init-config-reloader"
Dec 05 20:31:32 crc kubenswrapper[4744]: E1205 20:31:32.846955 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="config-reloader"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.846963 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="config-reloader"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.847165 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="thanos-sidecar"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.847191 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="config-reloader"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.847202 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" containerName="prometheus"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.853888 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.855731 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.868774 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.868773 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.870302 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-pzqcl"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.870401 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.870613 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.870790 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.885695 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.938976 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/be427b22-e361-4be4-8eec-bb2b4be47296-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939085 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939114 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939157 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939213 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be427b22-e361-4be4-8eec-bb2b4be47296-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939273 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp7r\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-kube-api-access-vsp7r\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:32 crc kubenswrapper[4744]: I1205 20:31:32.939305 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040526 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be427b22-e361-4be4-8eec-bb2b4be47296-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040558 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040614 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040634 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsp7r\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-kube-api-access-vsp7r\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040696 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.040757 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.042655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/be427b22-e361-4be4-8eec-bb2b4be47296-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.044496 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.045583 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.046127 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be427b22-e361-4be4-8eec-bb2b4be47296-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.046635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.047099 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.047425 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 
20:31:33.048588 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.048626 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6ddf1f7acec4066f68d4ea259a334af34aba683a37b9f2a47c0ede5bd328023f/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.049050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.051436 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/be427b22-e361-4be4-8eec-bb2b4be47296-config\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.065339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp7r\" (UniqueName: \"kubernetes.io/projected/be427b22-e361-4be4-8eec-bb2b4be47296-kube-api-access-vsp7r\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.090027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-292ec81c-9a0a-49dd-93ae-b0c211190c2c\") pod \"prometheus-metric-storage-0\" (UID: \"be427b22-e361-4be4-8eec-bb2b4be47296\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.170713 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.630191 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 05 20:31:33 crc kubenswrapper[4744]: W1205 20:31:33.632934 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe427b22_e361_4be4_8eec_bb2b4be47296.slice/crio-e1ed78440ecd531329446bff5b436603e5f76e644ea696445feac8d8056ba8bc WatchSource:0}: Error finding container e1ed78440ecd531329446bff5b436603e5f76e644ea696445feac8d8056ba8bc: Status 404 returned error can't find the container with id e1ed78440ecd531329446bff5b436603e5f76e644ea696445feac8d8056ba8bc Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.801477 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.820101 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.853107 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljrh9\" (UniqueName: \"kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9\") pod \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.853219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzvfp\" (UniqueName: \"kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp\") pod \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.853285 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts\") pod \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\" (UID: \"cd5ce9df-fda1-446b-a224-9f4d1a93dc47\") " Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.853354 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts\") pod \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\" (UID: \"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08\") " Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.854234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd5ce9df-fda1-446b-a224-9f4d1a93dc47" (UID: "cd5ce9df-fda1-446b-a224-9f4d1a93dc47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.854234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" (UID: "adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.856756 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp" (OuterVolumeSpecName: "kube-api-access-fzvfp") pod "cd5ce9df-fda1-446b-a224-9f4d1a93dc47" (UID: "cd5ce9df-fda1-446b-a224-9f4d1a93dc47"). InnerVolumeSpecName "kube-api-access-fzvfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.856997 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9" (OuterVolumeSpecName: "kube-api-access-ljrh9") pod "adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" (UID: "adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08"). InnerVolumeSpecName "kube-api-access-ljrh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.956657 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzvfp\" (UniqueName: \"kubernetes.io/projected/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-kube-api-access-fzvfp\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.956698 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ce9df-fda1-446b-a224-9f4d1a93dc47-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.956709 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:33 crc kubenswrapper[4744]: I1205 20:31:33.956720 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljrh9\" (UniqueName: \"kubernetes.io/projected/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08-kube-api-access-ljrh9\") on node \"crc\" DevicePath \"\"" Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.091778 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="663dea5b-3cc7-4c28-9803-302e771b8556" path="/var/lib/kubelet/pods/663dea5b-3cc7-4c28-9803-302e771b8556/volumes" Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.512025 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-c2ltm" event={"ID":"cd5ce9df-fda1-446b-a224-9f4d1a93dc47","Type":"ContainerDied","Data":"20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f"} Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.512334 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a6c4c9e6b2f6e58f263c511ce1bcc9fcd35a36d779e5d1f788324398ded73f" Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.512370 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-c2ltm" Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.513553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerStarted","Data":"e1ed78440ecd531329446bff5b436603e5f76e644ea696445feac8d8056ba8bc"} Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.516471 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" event={"ID":"adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08","Type":"ContainerDied","Data":"52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e"} Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.516531 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fd1f2d25acdfca892841572113fdbfe583c81f77ecc936d34ce6d2e5d40c1e" Dec 05 20:31:34 crc kubenswrapper[4744]: I1205 20:31:34.516653 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-d357-account-create-update-g66fz" Dec 05 20:31:36 crc kubenswrapper[4744]: I1205 20:31:36.540525 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerStarted","Data":"144c1f3ec717715a4381047198b79394283805acd5a1204e0cc39df95cf58be8"} Dec 05 20:31:42 crc kubenswrapper[4744]: I1205 20:31:42.593818 4744 generic.go:334] "Generic (PLEG): container finished" podID="c4dae229-7a1c-4eb8-8932-7fd75e348bb2" containerID="cc25a9e9d72d30ac5f2d89cc76735f430e59a33cd73780212408ec6dd28dc517" exitCode=0 Dec 05 20:31:42 crc kubenswrapper[4744]: I1205 20:31:42.593888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"c4dae229-7a1c-4eb8-8932-7fd75e348bb2","Type":"ContainerDied","Data":"cc25a9e9d72d30ac5f2d89cc76735f430e59a33cd73780212408ec6dd28dc517"} Dec 05 20:31:42 crc kubenswrapper[4744]: I1205 20:31:42.598020 4744 generic.go:334] "Generic (PLEG): container finished" podID="cfb456f7-66c1-4493-85d4-bae3322914f9" containerID="6b850b78d21075d7781357596ff60640cd71cc5ea27699baeb128222a1c29db5" exitCode=0 Dec 05 20:31:42 crc kubenswrapper[4744]: I1205 20:31:42.598105 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"cfb456f7-66c1-4493-85d4-bae3322914f9","Type":"ContainerDied","Data":"6b850b78d21075d7781357596ff60640cd71cc5ea27699baeb128222a1c29db5"} Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.609647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"c4dae229-7a1c-4eb8-8932-7fd75e348bb2","Type":"ContainerStarted","Data":"cffe26e0eff8d5173005fe18db9c7dc024a644bb9a1381b629c357df07508a67"} Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.610241 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.612609 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"cfb456f7-66c1-4493-85d4-bae3322914f9","Type":"ContainerStarted","Data":"5a2c17eaa77fe6a5c6db738b85d37941279f9d0f6b5621a8b5aec14e283c4fb0"} Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.612781 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.643161 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=37.479945946 podStartE2EDuration="1m6.643142474s" podCreationTimestamp="2025-12-05 20:30:37 +0000 UTC" firstStartedPulling="2025-12-05 20:30:39.371879975 +0000 UTC m=+1209.601691343" lastFinishedPulling="2025-12-05 20:31:08.535076503 +0000 UTC m=+1238.764887871" observedRunningTime="2025-12-05 20:31:43.636404547 +0000 UTC m=+1273.866215925" watchObservedRunningTime="2025-12-05 20:31:43.643142474 +0000 UTC m=+1273.872953842" Dec 05 20:31:43 crc kubenswrapper[4744]: I1205 20:31:43.669425 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" podStartSLOduration=37.090015482 podStartE2EDuration="1m5.669400713s" podCreationTimestamp="2025-12-05 20:30:38 +0000 UTC" 
firstStartedPulling="2025-12-05 20:30:39.951017967 +0000 UTC m=+1210.180829335" lastFinishedPulling="2025-12-05 20:31:08.530403188 +0000 UTC m=+1238.760214566" observedRunningTime="2025-12-05 20:31:43.659593261 +0000 UTC m=+1273.889404659" watchObservedRunningTime="2025-12-05 20:31:43.669400713 +0000 UTC m=+1273.899212091" Dec 05 20:31:44 crc kubenswrapper[4744]: I1205 20:31:44.623279 4744 generic.go:334] "Generic (PLEG): container finished" podID="be427b22-e361-4be4-8eec-bb2b4be47296" containerID="144c1f3ec717715a4381047198b79394283805acd5a1204e0cc39df95cf58be8" exitCode=0 Dec 05 20:31:44 crc kubenswrapper[4744]: I1205 20:31:44.623360 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerDied","Data":"144c1f3ec717715a4381047198b79394283805acd5a1204e0cc39df95cf58be8"} Dec 05 20:31:45 crc kubenswrapper[4744]: I1205 20:31:45.644055 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerStarted","Data":"3493c2a49defb1c34c614988db960b6cdab0fd196e6d9f52190af48ff265dca3"} Dec 05 20:31:47 crc kubenswrapper[4744]: I1205 20:31:47.663658 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerStarted","Data":"38685cb8cc3fb83714ba742d654678aa3632bddf864d282cd1c8d1e0efa811da"} Dec 05 20:31:47 crc kubenswrapper[4744]: I1205 20:31:47.664152 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"be427b22-e361-4be4-8eec-bb2b4be47296","Type":"ContainerStarted","Data":"773296fdc1c6773fdf9075836e835e538f40e8bfb5317aed0e3cf2ce5135f444"} Dec 05 20:31:47 crc kubenswrapper[4744]: I1205 20:31:47.696108 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=15.696088953 podStartE2EDuration="15.696088953s" podCreationTimestamp="2025-12-05 20:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:31:47.689019959 +0000 UTC m=+1277.918831337" watchObservedRunningTime="2025-12-05 20:31:47.696088953 +0000 UTC m=+1277.925900321" Dec 05 20:31:48 crc kubenswrapper[4744]: I1205 20:31:48.172332 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:48 crc kubenswrapper[4744]: I1205 20:31:48.172371 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:48 crc kubenswrapper[4744]: I1205 20:31:48.178226 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:48 crc kubenswrapper[4744]: I1205 20:31:48.676660 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 05 20:31:49 crc kubenswrapper[4744]: I1205 20:31:49.807216 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 05 20:31:49 crc kubenswrapper[4744]: I1205 20:31:49.807844 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:31:49 crc kubenswrapper[4744]: I1205 20:31:49.807941 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:31:49 crc kubenswrapper[4744]: I1205 20:31:49.808830 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:31:49 crc kubenswrapper[4744]: I1205 20:31:49.808924 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d" gracePeriod=600 Dec 05 20:31:50 crc kubenswrapper[4744]: I1205 20:31:50.693471 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d" exitCode=0 Dec 05 20:31:50 crc kubenswrapper[4744]: I1205 20:31:50.693533 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d"} Dec 05 20:31:50 crc kubenswrapper[4744]: I1205 20:31:50.693586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c"} Dec 05 20:31:50 crc kubenswrapper[4744]: I1205 20:31:50.693604 4744 scope.go:117] "RemoveContainer" containerID="7361719f1aaa6a0025abf0bbccc7737602f9bbc3dfb06fc01d1de9cb17c502bc" Dec 05 20:31:58 crc kubenswrapper[4744]: I1205 20:31:58.896367 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 05 20:31:59 crc kubenswrapper[4744]: I1205 20:31:59.495070 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.230552 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-mrwcv"] Dec 05 20:32:01 crc kubenswrapper[4744]: E1205 20:32:01.231457 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5ce9df-fda1-446b-a224-9f4d1a93dc47" containerName="mariadb-database-create" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.231475 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5ce9df-fda1-446b-a224-9f4d1a93dc47" containerName="mariadb-database-create" Dec 05 20:32:01 crc 
kubenswrapper[4744]: E1205 20:32:01.231490 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" containerName="mariadb-account-create-update" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.231498 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" containerName="mariadb-account-create-update" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.231679 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" containerName="mariadb-account-create-update" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.231695 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5ce9df-fda1-446b-a224-9f4d1a93dc47" containerName="mariadb-database-create" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.232443 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.239400 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-mrwcv"] Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.333121 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-24zdw" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.333417 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.334417 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.334471 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.435337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.435380 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8hj\" (UniqueName: \"kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.435931 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.537411 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8hj\" (UniqueName: \"kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc 
kubenswrapper[4744]: I1205 20:32:01.537759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.537934 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.543066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.547147 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.567655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8hj\" (UniqueName: \"kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj\") pod \"keystone-db-sync-mrwcv\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:01 crc kubenswrapper[4744]: I1205 20:32:01.670975 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:02 crc kubenswrapper[4744]: I1205 20:32:02.126781 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-mrwcv"] Dec 05 20:32:02 crc kubenswrapper[4744]: I1205 20:32:02.830031 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" event={"ID":"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f","Type":"ContainerStarted","Data":"6cb2a897864ab584d0e1e9d894c196316bbc0c1b1eaba57370377d4488b8b542"} Dec 05 20:32:09 crc kubenswrapper[4744]: I1205 20:32:09.889914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" event={"ID":"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f","Type":"ContainerStarted","Data":"33e5c90f8e50d62e3036d65b4ce2f67a2477301cd44398ba9da4ffd2ca6283be"} Dec 05 20:32:09 crc kubenswrapper[4744]: I1205 20:32:09.912608 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" podStartSLOduration=1.511693596 podStartE2EDuration="8.912593065s" podCreationTimestamp="2025-12-05 20:32:01 +0000 UTC" firstStartedPulling="2025-12-05 20:32:02.137836092 +0000 UTC m=+1292.367647460" lastFinishedPulling="2025-12-05 20:32:09.538735561 +0000 UTC m=+1299.768546929" observedRunningTime="2025-12-05 20:32:09.908059864 +0000 UTC m=+1300.137871242" watchObservedRunningTime="2025-12-05 20:32:09.912593065 +0000 UTC m=+1300.142404453" Dec 05 20:32:13 crc kubenswrapper[4744]: I1205 20:32:13.922864 4744 generic.go:334] "Generic (PLEG): container finished" podID="f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" containerID="33e5c90f8e50d62e3036d65b4ce2f67a2477301cd44398ba9da4ffd2ca6283be" exitCode=0 Dec 05 20:32:13 crc kubenswrapper[4744]: I1205 20:32:13.922953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" event={"ID":"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f","Type":"ContainerDied","Data":"33e5c90f8e50d62e3036d65b4ce2f67a2477301cd44398ba9da4ffd2ca6283be"} Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.248620 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.336406 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf8hj\" (UniqueName: \"kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj\") pod \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.336510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data\") pod \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.336572 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle\") pod \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\" (UID: \"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f\") " Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.345560 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj" (OuterVolumeSpecName: "kube-api-access-pf8hj") pod "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" (UID: "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f"). InnerVolumeSpecName "kube-api-access-pf8hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.364451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" (UID: "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.386192 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data" (OuterVolumeSpecName: "config-data") pod "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" (UID: "f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.438165 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.438200 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf8hj\" (UniqueName: \"kubernetes.io/projected/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-kube-api-access-pf8hj\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.438211 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.941890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" event={"ID":"f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f","Type":"ContainerDied","Data":"6cb2a897864ab584d0e1e9d894c196316bbc0c1b1eaba57370377d4488b8b542"} Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.941925 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cb2a897864ab584d0e1e9d894c196316bbc0c1b1eaba57370377d4488b8b542" Dec 05 20:32:15 crc kubenswrapper[4744]: I1205 20:32:15.942003 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-mrwcv" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.149904 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-84vkk"] Dec 05 20:32:16 crc kubenswrapper[4744]: E1205 20:32:16.150276 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" containerName="keystone-db-sync" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.150311 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" containerName="keystone-db-sync" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.150517 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" containerName="keystone-db-sync" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.151168 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.153078 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.153384 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-24zdw" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.153413 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.153456 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.153484 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.170309 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-84vkk"] Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.248090 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.248237 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.248351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssxgh\" (UniqueName: \"kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.248999 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.249150 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.249266 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " 
pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.283946 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.286273 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.288994 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.295110 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.302576 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssxgh\" (UniqueName: \"kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350280 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350366 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.350608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.354396 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys\") pod \"keystone-bootstrap-84vkk\" (UID: 
\"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.354939 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.355651 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.366934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.366944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.390874 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssxgh\" (UniqueName: \"kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh\") pod \"keystone-bootstrap-84vkk\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453691 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453743 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbgv\" (UniqueName: \"kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453797 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 
05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453834 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453862 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.453884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.471817 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558163 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558283 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbgv\" (UniqueName: \"kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558350 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.558381 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc 
kubenswrapper[4744]: I1205 20:32:16.558405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.563812 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.564049 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.571056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.579018 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.579915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.583191 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.614114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbgv\" (UniqueName: \"kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv\") pod \"ceilometer-0\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:16 crc kubenswrapper[4744]: I1205 20:32:16.904954 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:17 crc kubenswrapper[4744]: I1205 20:32:17.202790 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-84vkk"] Dec 05 20:32:17 crc kubenswrapper[4744]: I1205 20:32:17.389312 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:17 crc kubenswrapper[4744]: I1205 20:32:17.968555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" event={"ID":"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f","Type":"ContainerStarted","Data":"05f3ef277b461a067cfefffc82d29ef4ae612f4d0bb09ce1b4ba2623bc968e6e"} Dec 05 20:32:17 crc kubenswrapper[4744]: I1205 20:32:17.975587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerStarted","Data":"65dec2fe5a00cfe2b9996274dfa404a0c541b8fbef3823a1be4c667c26b30ef2"} Dec 05 20:32:18 crc kubenswrapper[4744]: I1205 20:32:18.226177 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:18 crc kubenswrapper[4744]: I1205 20:32:18.990111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" event={"ID":"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f","Type":"ContainerStarted","Data":"eedd081d5a2588b462850249437846282ad7b3ebf125c6a4ec001312d4153d7b"} Dec 05 20:32:19 crc kubenswrapper[4744]: I1205 20:32:19.014628 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" podStartSLOduration=3.014429569 podStartE2EDuration="3.014429569s" podCreationTimestamp="2025-12-05 20:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:19.007665714 +0000 UTC m=+1309.237477082" watchObservedRunningTime="2025-12-05 20:32:19.014429569 +0000 UTC m=+1309.244240937" Dec 05 20:32:23 crc kubenswrapper[4744]: I1205 20:32:23.033861 4744 generic.go:334] "Generic (PLEG): container finished" podID="e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" containerID="eedd081d5a2588b462850249437846282ad7b3ebf125c6a4ec001312d4153d7b" exitCode=0 Dec 05 20:32:23 crc kubenswrapper[4744]: I1205 20:32:23.033992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" event={"ID":"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f","Type":"ContainerDied","Data":"eedd081d5a2588b462850249437846282ad7b3ebf125c6a4ec001312d4153d7b"} Dec 05 20:32:23 crc kubenswrapper[4744]: I1205 20:32:23.035926 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerStarted","Data":"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733"} Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.045079 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerStarted","Data":"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb"} Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.345889 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.401857 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.401960 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.402019 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssxgh\" (UniqueName: \"kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.402069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.402097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.402166 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle\") pod \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\" (UID: \"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f\") " Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.415152 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts" (OuterVolumeSpecName: "scripts") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.415258 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.415251 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh" (OuterVolumeSpecName: "kube-api-access-ssxgh") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "kube-api-access-ssxgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.433256 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.441809 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data" (OuterVolumeSpecName: "config-data") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.462941 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" (UID: "e21d43dc-a6d6-4e5f-8596-5e0a9968d35f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504353 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504409 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504420 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504429 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504438 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssxgh\" (UniqueName: \"kubernetes.io/projected/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-kube-api-access-ssxgh\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:24 crc kubenswrapper[4744]: I1205 20:32:24.504449 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.075074 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-84vkk" event={"ID":"e21d43dc-a6d6-4e5f-8596-5e0a9968d35f","Type":"ContainerDied","Data":"05f3ef277b461a067cfefffc82d29ef4ae612f4d0bb09ce1b4ba2623bc968e6e"} Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.075119 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f3ef277b461a067cfefffc82d29ef4ae612f4d0bb09ce1b4ba2623bc968e6e" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.075191 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-84vkk"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.130251 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-84vkk"]
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.132625 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-84vkk"]
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.221158 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8fnjk"]
Dec 05 20:32:25 crc kubenswrapper[4744]: E1205 20:32:25.221544 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" containerName="keystone-bootstrap"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.221568 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" containerName="keystone-bootstrap"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.221793 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" containerName="keystone-bootstrap"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.222575 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.226779 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.227067 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.227097 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.227332 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-24zdw"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.227439 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.255502 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8fnjk"]
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.325100 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm8q4\" (UniqueName: \"kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk"
Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.325221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk"
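The "SyncLoop DELETE"/"REMOVE"/"ADD"/"UPDATE" records come from the kubelet's main sync loop (kubelet.go:2421-2437) dispatching pod changes received from the API server: the old bootstrap pod is deleted and purged while its replacement keystone-bootstrap-8fnjk is admitted. A minimal dispatch loop in the same spirit; the types are illustrative, not kubelet's own.

```go
package main

import "fmt"

type opKind string

const (
	opAdd    opKind = "ADD"    // pod newly scheduled to this node
	opUpdate opKind = "UPDATE" // spec or status changed
	opDelete opKind = "DELETE" // graceful deletion requested
	opRemove opKind = "REMOVE" // pod fully gone from the API server
)

type podUpdate struct {
	Kind opKind
	Pods []string
}

// syncLoop drains pod updates and dispatches them, mirroring the
// "SyncLoop ADD/UPDATE/DELETE/REMOVE" lines in the journal above.
func syncLoop(updates <-chan podUpdate) {
	for u := range updates {
		fmt.Printf("SyncLoop %s source=%q pods=%v\n", u.Kind, "api", u.Pods)
	}
}

func main() {
	ch := make(chan podUpdate, 4)
	ch <- podUpdate{opDelete, []string{"watcher-kuttl-default/keystone-bootstrap-84vkk"}}
	ch <- podUpdate{opRemove, []string{"watcher-kuttl-default/keystone-bootstrap-84vkk"}}
	ch <- podUpdate{opAdd, []string{"watcher-kuttl-default/keystone-bootstrap-8fnjk"}}
	ch <- podUpdate{opUpdate, []string{"watcher-kuttl-default/keystone-bootstrap-8fnjk"}}
	close(ch)
	syncLoop(ch)
}
```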
\"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.325345 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.325445 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.325529 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.426935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.427034 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.427079 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.427136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.427214 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.427275 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm8q4\" (UniqueName: 
\"kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.444885 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.447228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.447897 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.448530 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm8q4\" (UniqueName: \"kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.450551 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.450917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts\") pod \"keystone-bootstrap-8fnjk\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:25 crc kubenswrapper[4744]: I1205 20:32:25.552225 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:26 crc kubenswrapper[4744]: I1205 20:32:26.025898 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8fnjk"] Dec 05 20:32:26 crc kubenswrapper[4744]: I1205 20:32:26.095800 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21d43dc-a6d6-4e5f-8596-5e0a9968d35f" path="/var/lib/kubelet/pods/e21d43dc-a6d6-4e5f-8596-5e0a9968d35f/volumes" Dec 05 20:32:26 crc kubenswrapper[4744]: I1205 20:32:26.096380 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" event={"ID":"b7db7edb-cbd0-485c-89c5-8f621cdf47df","Type":"ContainerStarted","Data":"f9d319a895fefd939ec5bbfb56154de282bbb7b3983b9b73282f179beb165528"} Dec 05 20:32:27 crc kubenswrapper[4744]: I1205 20:32:27.108848 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" event={"ID":"b7db7edb-cbd0-485c-89c5-8f621cdf47df","Type":"ContainerStarted","Data":"dfd8a7176d2d1bbc74e20bf321b3ff236d1d9b2f3769b9708022eabed666113c"} Dec 05 20:32:27 crc kubenswrapper[4744]: I1205 20:32:27.132779 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" podStartSLOduration=2.1327621 podStartE2EDuration="2.1327621s" podCreationTimestamp="2025-12-05 20:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:27.127002029 +0000 UTC m=+1317.356813397" watchObservedRunningTime="2025-12-05 20:32:27.1327621 +0000 UTC m=+1317.362573468" Dec 05 20:32:35 crc kubenswrapper[4744]: I1205 20:32:35.177974 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7db7edb-cbd0-485c-89c5-8f621cdf47df" containerID="dfd8a7176d2d1bbc74e20bf321b3ff236d1d9b2f3769b9708022eabed666113c" exitCode=0 Dec 05 20:32:35 crc kubenswrapper[4744]: I1205 20:32:35.178061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" event={"ID":"b7db7edb-cbd0-485c-89c5-8f621cdf47df","Type":"ContainerDied","Data":"dfd8a7176d2d1bbc74e20bf321b3ff236d1d9b2f3769b9708022eabed666113c"} Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.185932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerStarted","Data":"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f"} Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.528528 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.703466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm8q4\" (UniqueName: \"kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.704503 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.704625 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.704691 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.704723 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.704749 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys\") pod \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\" (UID: \"b7db7edb-cbd0-485c-89c5-8f621cdf47df\") " Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.709485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.710390 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.710692 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4" (OuterVolumeSpecName: "kube-api-access-gm8q4") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "kube-api-access-gm8q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.711719 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts" (OuterVolumeSpecName: "scripts") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.728311 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.730514 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data" (OuterVolumeSpecName: "config-data") pod "b7db7edb-cbd0-485c-89c5-8f621cdf47df" (UID: "b7db7edb-cbd0-485c-89c5-8f621cdf47df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806788 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm8q4\" (UniqueName: \"kubernetes.io/projected/b7db7edb-cbd0-485c-89c5-8f621cdf47df-kube-api-access-gm8q4\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806831 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806883 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806897 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806919 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:36 crc kubenswrapper[4744]: I1205 20:32:36.806935 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7db7edb-cbd0-485c-89c5-8f621cdf47df-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.196489 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" event={"ID":"b7db7edb-cbd0-485c-89c5-8f621cdf47df","Type":"ContainerDied","Data":"f9d319a895fefd939ec5bbfb56154de282bbb7b3983b9b73282f179beb165528"} Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.196536 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d319a895fefd939ec5bbfb56154de282bbb7b3983b9b73282f179beb165528" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.196544 4744 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8fnjk" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.298328 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:32:37 crc kubenswrapper[4744]: E1205 20:32:37.298924 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7db7edb-cbd0-485c-89c5-8f621cdf47df" containerName="keystone-bootstrap" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.298938 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7db7edb-cbd0-485c-89c5-8f621cdf47df" containerName="keystone-bootstrap" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.299095 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7db7edb-cbd0-485c-89c5-8f621cdf47df" containerName="keystone-bootstrap" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.299607 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.302038 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.302068 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.302232 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.303611 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-24zdw" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.303842 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.303896 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.327145 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415225 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmcj\" (UniqueName: \"kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415282 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415429 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " 
pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415638 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415664 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.415722 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.517459 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.517749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.517840 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.517962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts\") pod 
\"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.518086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmcj\" (UniqueName: \"kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.518202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.518692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.518849 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.522761 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.523159 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.523423 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.530806 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.530941 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: 
\"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.530925 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.535813 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.537501 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmcj\" (UniqueName: \"kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj\") pod \"keystone-6ddbdbb77d-2mgh6\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:37 crc kubenswrapper[4744]: I1205 20:32:37.615748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:38 crc kubenswrapper[4744]: I1205 20:32:38.091915 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:32:38 crc kubenswrapper[4744]: I1205 20:32:38.211667 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" event={"ID":"0474bea5-5db3-4b16-a280-9589048721c1","Type":"ContainerStarted","Data":"675f9cdff4adaf73cf7bda6f8bcb043fa2f456dc8f97d909a44c342d3578d5d7"} Dec 05 20:32:39 crc kubenswrapper[4744]: I1205 20:32:39.222053 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" event={"ID":"0474bea5-5db3-4b16-a280-9589048721c1","Type":"ContainerStarted","Data":"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a"} Dec 05 20:32:39 crc kubenswrapper[4744]: I1205 20:32:39.222435 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:32:39 crc kubenswrapper[4744]: I1205 20:32:39.245591 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" podStartSLOduration=2.245577955 podStartE2EDuration="2.245577955s" podCreationTimestamp="2025-12-05 20:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:32:39.243893094 +0000 UTC m=+1329.473704452" watchObservedRunningTime="2025-12-05 20:32:39.245577955 +0000 UTC m=+1329.475389323" Dec 05 20:32:52 crc kubenswrapper[4744]: E1205 20:32:52.573274 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 05 20:32:52 crc kubenswrapper[4744]: E1205 20:32:52.574066 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5qbgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_watcher-kuttl-default(70b7fe68-7fe4-40ed-878c-f07405b97069): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 20:32:52 crc kubenswrapper[4744]: E1205 20:32:52.575341 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/ceilometer-0" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" Dec 05 20:32:53 crc kubenswrapper[4744]: I1205 20:32:53.323589 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-central-agent" containerID="cri-o://cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" gracePeriod=30 Dec 05 20:32:53 crc 
Dec 05 20:32:53 crc kubenswrapper[4744]: I1205 20:32:53.323589 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-central-agent" containerID="cri-o://cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" gracePeriod=30
Dec 05 20:32:53 crc kubenswrapper[4744]: I1205 20:32:53.323668 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-notification-agent" containerID="cri-o://742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb" gracePeriod=30
Dec 05 20:32:53 crc kubenswrapper[4744]: I1205 20:32:53.323669 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="sg-core" containerID="cri-o://91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f" gracePeriod=30
Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.333707 4744 generic.go:334] "Generic (PLEG): container finished" podID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerID="91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f" exitCode=2
Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.333740 4744 generic.go:334] "Generic (PLEG): container finished" podID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerID="cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" exitCode=0
Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.333768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerDied","Data":"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f"}
Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.333820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerDied","Data":"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733"}
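"Killing container with a grace period" means the runtime first delivers SIGTERM and only escalates to SIGKILL if the container is still running when the grace period (30s here) expires; the exit codes that follow are each process's own response to the signal. A Unix-only sketch of the same pattern against an ordinary process, with illustrative names:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace mirrors the kubelet's pattern: SIGTERM first, then a
// forced kill if the process outlives the grace period.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	_ = cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second) // the kubelet used gracePeriod=30
}
```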
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799058 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799184 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799239 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799264 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbgv\" (UniqueName: \"kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799308 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799339 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799375 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml\") pod \"70b7fe68-7fe4-40ed-878c-f07405b97069\" (UID: \"70b7fe68-7fe4-40ed-878c-f07405b97069\") " Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799675 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.799686 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.804373 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv" (OuterVolumeSpecName: "kube-api-access-5qbgv") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "kube-api-access-5qbgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.804421 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts" (OuterVolumeSpecName: "scripts") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.828985 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.845554 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data" (OuterVolumeSpecName: "config-data") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.860193 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70b7fe68-7fe4-40ed-878c-f07405b97069" (UID: "70b7fe68-7fe4-40ed-878c-f07405b97069"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901176 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901214 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901226 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70b7fe68-7fe4-40ed-878c-f07405b97069-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901239 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbgv\" (UniqueName: \"kubernetes.io/projected/70b7fe68-7fe4-40ed-878c-f07405b97069-kube-api-access-5qbgv\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901251 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901265 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:54 crc kubenswrapper[4744]: I1205 20:32:54.901278 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70b7fe68-7fe4-40ed-878c-f07405b97069-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.343862 4744 generic.go:334] "Generic (PLEG): container finished" podID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerID="742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb" exitCode=0 Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.343924 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.343935 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerDied","Data":"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb"} Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.343970 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"70b7fe68-7fe4-40ed-878c-f07405b97069","Type":"ContainerDied","Data":"65dec2fe5a00cfe2b9996274dfa404a0c541b8fbef3823a1be4c667c26b30ef2"} Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.343994 4744 scope.go:117] "RemoveContainer" containerID="91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.367116 4744 scope.go:117] "RemoveContainer" containerID="742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.404691 4744 scope.go:117] "RemoveContainer" containerID="cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.408365 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.441557 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.450865 4744 scope.go:117] "RemoveContainer" containerID="91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f" Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.454343 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f\": container with ID starting with 91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f not found: ID does not exist" containerID="91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.454447 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f"} err="failed to get container status \"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f\": rpc error: code = NotFound desc = could not find container \"91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f\": container with ID starting with 91c7a0a49e5bc5960fa1099f09e59bac227d427432166992dd9e067673de675f not found: ID does not exist" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.454482 4744 scope.go:117] "RemoveContainer" containerID="742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb" Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.456529 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb\": container with ID starting with 742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb not found: ID does not exist" containerID="742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.456558 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb"} err="failed to get container status \"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb\": rpc error: code = NotFound desc = could not find container \"742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb\": container with ID starting with 742c146e9cd7a5703b7573153e10aa089db73befa020a857a311d6bbdd4769eb not found: ID does not exist" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.456585 4744 scope.go:117] "RemoveContainer" containerID="cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.456974 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733\": container with ID starting with cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733 not found: ID does not exist" containerID="cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.457025 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733"} err="failed to get container status \"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733\": rpc error: code = NotFound desc = could not find container \"cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733\": container with ID starting with cf754adf40cb03ade5417a95909955bac3c042fa7f36551489859a2f6a3c6733 not found: ID does not exist" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.457179 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.458679 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-central-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.458709 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-central-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.458732 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-notification-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.458740 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-notification-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: E1205 20:32:55.458765 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="sg-core" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.458774 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="sg-core" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.459445 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-central-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.459497 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="sg-core" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.459511 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" containerName="ceilometer-notification-agent" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.464993 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.467066 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.467509 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.471388 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511262 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511392 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511523 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511608 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7gz\" (UniqueName: \"kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511694 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.511926 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc 
kubenswrapper[4744]: I1205 20:32:55.612830 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.612879 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7gz\" (UniqueName: \"kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.612900 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.612952 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.612987 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.613023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.613037 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.614106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.614137 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.616986 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.617077 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.617481 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.617606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.630428 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7gz\" (UniqueName: \"kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz\") pod \"ceilometer-0\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:55 crc kubenswrapper[4744]: I1205 20:32:55.782484 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:32:56 crc kubenswrapper[4744]: I1205 20:32:56.092888 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b7fe68-7fe4-40ed-878c-f07405b97069" path="/var/lib/kubelet/pods/70b7fe68-7fe4-40ed-878c-f07405b97069/volumes" Dec 05 20:32:56 crc kubenswrapper[4744]: I1205 20:32:56.248283 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:32:56 crc kubenswrapper[4744]: I1205 20:32:56.260821 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:32:56 crc kubenswrapper[4744]: I1205 20:32:56.354619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerStarted","Data":"b81de6449a351666655a49f3eed62385c5d64835ebcf612eb7e7209112b5633d"} Dec 05 20:32:57 crc kubenswrapper[4744]: I1205 20:32:57.364416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerStarted","Data":"7fd972dece50ec445a276426f3ff87191707308111164d128dcc0487636eef48"} Dec 05 20:33:00 crc kubenswrapper[4744]: I1205 20:33:00.396098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerStarted","Data":"34ba9d8ba2009a5bf48d6d36dd72385f233a112ba78f5c9bbe22c9ba7891014e"} Dec 05 20:33:00 crc kubenswrapper[4744]: I1205 20:33:00.396964 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerStarted","Data":"f0aa525ff9d786ec806bc5fe8fe7221c84b35cb3f719b5ec702810742af9670c"} Dec 05 20:33:02 crc kubenswrapper[4744]: I1205 
20:33:02.424859 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerStarted","Data":"62a3d9f2676fa459a3197cb62a233700f1071c33bf38dc797792607b3ee2da8b"} Dec 05 20:33:02 crc kubenswrapper[4744]: I1205 20:33:02.426829 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:02 crc kubenswrapper[4744]: I1205 20:33:02.459780 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.18690285 podStartE2EDuration="7.459759026s" podCreationTimestamp="2025-12-05 20:32:55 +0000 UTC" firstStartedPulling="2025-12-05 20:32:56.2605093 +0000 UTC m=+1346.490320678" lastFinishedPulling="2025-12-05 20:33:01.533365486 +0000 UTC m=+1351.763176854" observedRunningTime="2025-12-05 20:33:02.450787746 +0000 UTC m=+1352.680599174" watchObservedRunningTime="2025-12-05 20:33:02.459759026 +0000 UTC m=+1352.689570384" Dec 05 20:33:09 crc kubenswrapper[4744]: I1205 20:33:09.241055 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.820556 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.821737 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.825050 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.825523 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.825814 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-lb6lc" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.831227 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.968008 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.968605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:10 crc kubenswrapper[4744]: I1205 20:33:10.968770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrtr\" (UniqueName: \"kubernetes.io/projected/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-kube-api-access-wjrtr\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:10 crc 
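
The pod_startup_latency_tracker line above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (20:33:02.459759026 - 20:32:55 = 7.459759026s), and podStartSLOduration subtracts the image-pull window (lastFinishedPulling - firstStartedPulling). A back-of-envelope check in Go using the wall-clock fields; the last digit lands on 2.18690284s rather than the reported 2.18690285 because the tracker evidently uses the monotonic m=+ readings, which differ by about 10ns here:

    package main

    import (
        "fmt"
        "time"
    )

    func parse(s string) time.Time {
        // The timestamps in the log are Go's default time.String() layout.
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := parse("2025-12-05 20:32:55 +0000 UTC")
        firstPull := parse("2025-12-05 20:32:56.2605093 +0000 UTC")
        lastPull := parse("2025-12-05 20:33:01.533365486 +0000 UTC")
        running := parse("2025-12-05 20:33:02.459759026 +0000 UTC") // watchObservedRunningTime

        e2e := running.Sub(created)          // 7.459759026s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // ~2.18690284s = podStartSLOduration
        fmt.Println(e2e, slo)
    }
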
kubenswrapper[4744]: I1205 20:33:10.968880 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.069624 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.069665 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.069704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrtr\" (UniqueName: \"kubernetes.io/projected/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-kube-api-access-wjrtr\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.069731 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.070653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.075269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.075915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.097893 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrtr\" (UniqueName: \"kubernetes.io/projected/b9978ca7-d572-45a0-8b3a-94a3eef5e1b2-kube-api-access-wjrtr\") pod \"openstackclient\" (UID: \"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2\") " pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:11 crc kubenswrapper[4744]: I1205 20:33:11.162788 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 05 20:33:12 crc kubenswrapper[4744]: I1205 20:33:12.069038 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 05 20:33:12 crc kubenswrapper[4744]: I1205 20:33:12.521598 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2","Type":"ContainerStarted","Data":"9ee2943d8e430fd57df9edc220feb9e3047c8288c99a61c200bd225bbe25c51f"} Dec 05 20:33:22 crc kubenswrapper[4744]: I1205 20:33:22.620475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"b9978ca7-d572-45a0-8b3a-94a3eef5e1b2","Type":"ContainerStarted","Data":"615be3ad5e68f2a5b9c9f8dd6883d776022c6c578d55801decdedb11b4fadb87"} Dec 05 20:33:22 crc kubenswrapper[4744]: I1205 20:33:22.645487 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=2.503416268 podStartE2EDuration="12.645456779s" podCreationTimestamp="2025-12-05 20:33:10 +0000 UTC" firstStartedPulling="2025-12-05 20:33:12.075346822 +0000 UTC m=+1362.305158190" lastFinishedPulling="2025-12-05 20:33:22.217387293 +0000 UTC m=+1372.447198701" observedRunningTime="2025-12-05 20:33:22.639013588 +0000 UTC m=+1372.868824986" watchObservedRunningTime="2025-12-05 20:33:22.645456779 +0000 UTC m=+1372.875268187" Dec 05 20:33:25 crc kubenswrapper[4744]: I1205 20:33:25.788254 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:28 crc kubenswrapper[4744]: I1205 20:33:28.657173 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:28 crc kubenswrapper[4744]: I1205 20:33:28.657612 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" containerName="kube-state-metrics" containerID="cri-o://b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1" gracePeriod=30 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.127258 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.298660 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drmlg\" (UniqueName: \"kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg\") pod \"5b28c681-e337-45a6-b41f-37ce1c0cc03b\" (UID: \"5b28c681-e337-45a6-b41f-37ce1c0cc03b\") " Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.304240 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg" (OuterVolumeSpecName: "kube-api-access-drmlg") pod "5b28c681-e337-45a6-b41f-37ce1c0cc03b" (UID: "5b28c681-e337-45a6-b41f-37ce1c0cc03b"). InnerVolumeSpecName "kube-api-access-drmlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.400725 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drmlg\" (UniqueName: \"kubernetes.io/projected/5b28c681-e337-45a6-b41f-37ce1c0cc03b-kube-api-access-drmlg\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.679382 4744 generic.go:334] "Generic (PLEG): container finished" podID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" containerID="b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1" exitCode=2 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.679425 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5b28c681-e337-45a6-b41f-37ce1c0cc03b","Type":"ContainerDied","Data":"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1"} Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.679469 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.679491 4744 scope.go:117] "RemoveContainer" containerID="b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.679478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5b28c681-e337-45a6-b41f-37ce1c0cc03b","Type":"ContainerDied","Data":"00f761df3a4d7b6f59687ddfd6fadf79018640b87d88b16d40db2f5615e824a3"} Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.711650 4744 scope.go:117] "RemoveContainer" containerID="b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1" Dec 05 20:33:29 crc kubenswrapper[4744]: E1205 20:33:29.712587 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1\": container with ID starting with b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1 not found: ID does not exist" containerID="b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.712630 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1"} err="failed to get container status \"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1\": rpc error: code = NotFound desc = could not find container \"b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1\": container with ID starting with b9912a51a92948c0e12e41fd57b66a7f508fc0859b95e9f15a71830b37b67ad1 not found: ID does not exist" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.714964 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.723804 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.746885 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:29 crc kubenswrapper[4744]: E1205 20:33:29.747333 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" containerName="kube-state-metrics" Dec 05 
20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.747353 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" containerName="kube-state-metrics" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.747544 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" containerName="kube-state-metrics" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.750153 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.752704 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.752979 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.753740 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.825725 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.825996 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-central-agent" containerID="cri-o://7fd972dece50ec445a276426f3ff87191707308111164d128dcc0487636eef48" gracePeriod=30 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.826442 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="proxy-httpd" containerID="cri-o://62a3d9f2676fa459a3197cb62a233700f1071c33bf38dc797792607b3ee2da8b" gracePeriod=30 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.826510 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="sg-core" containerID="cri-o://34ba9d8ba2009a5bf48d6d36dd72385f233a112ba78f5c9bbe22c9ba7891014e" gracePeriod=30 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.826552 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-notification-agent" containerID="cri-o://f0aa525ff9d786ec806bc5fe8fe7221c84b35cb3f719b5ec702810742af9670c" gracePeriod=30 Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.908485 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4hm\" (UniqueName: \"kubernetes.io/projected/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-api-access-7h4hm\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.908561 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " 
pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.908588 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:29 crc kubenswrapper[4744]: I1205 20:33:29.908829 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.010007 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.010096 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4hm\" (UniqueName: \"kubernetes.io/projected/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-api-access-7h4hm\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.010148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.010168 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.014114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.018375 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.018897 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.038189 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4hm\" (UniqueName: \"kubernetes.io/projected/1b1577b0-93ce-41d3-9c87-6009a42d525a-kube-api-access-7h4hm\") pod \"kube-state-metrics-0\" (UID: \"1b1577b0-93ce-41d3-9c87-6009a42d525a\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.074460 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.098352 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b28c681-e337-45a6-b41f-37ce1c0cc03b" path="/var/lib/kubelet/pods/5b28c681-e337-45a6-b41f-37ce1c0cc03b/volumes" Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.434010 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 05 20:33:30 crc kubenswrapper[4744]: W1205 20:33:30.470189 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1577b0_93ce_41d3_9c87_6009a42d525a.slice/crio-67268d928d5f4aa1e16ed3d6265f869829226bf23e6196ccac38145978f5160c WatchSource:0}: Error finding container 67268d928d5f4aa1e16ed3d6265f869829226bf23e6196ccac38145978f5160c: Status 404 returned error can't find the container with id 67268d928d5f4aa1e16ed3d6265f869829226bf23e6196ccac38145978f5160c Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.708736 4744 generic.go:334] "Generic (PLEG): container finished" podID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerID="62a3d9f2676fa459a3197cb62a233700f1071c33bf38dc797792607b3ee2da8b" exitCode=0 Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709202 4744 generic.go:334] "Generic (PLEG): container finished" podID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerID="34ba9d8ba2009a5bf48d6d36dd72385f233a112ba78f5c9bbe22c9ba7891014e" exitCode=2 Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709215 4744 generic.go:334] "Generic (PLEG): container finished" podID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerID="f0aa525ff9d786ec806bc5fe8fe7221c84b35cb3f719b5ec702810742af9670c" exitCode=0 Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709224 4744 generic.go:334] "Generic (PLEG): container finished" podID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerID="7fd972dece50ec445a276426f3ff87191707308111164d128dcc0487636eef48" exitCode=0 Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.708828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerDied","Data":"62a3d9f2676fa459a3197cb62a233700f1071c33bf38dc797792607b3ee2da8b"} Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerDied","Data":"34ba9d8ba2009a5bf48d6d36dd72385f233a112ba78f5c9bbe22c9ba7891014e"} Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709368 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerDied","Data":"f0aa525ff9d786ec806bc5fe8fe7221c84b35cb3f719b5ec702810742af9670c"} Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.709380 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerDied","Data":"7fd972dece50ec445a276426f3ff87191707308111164d128dcc0487636eef48"} Dec 05 20:33:30 crc kubenswrapper[4744]: I1205 20:33:30.711232 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"1b1577b0-93ce-41d3-9c87-6009a42d525a","Type":"ContainerStarted","Data":"67268d928d5f4aa1e16ed3d6265f869829226bf23e6196ccac38145978f5160c"} Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.159225 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.341682 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342058 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342211 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342340 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342474 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342723 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg7gz\" (UniqueName: \"kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz\") pod \"5062a9ae-9470-4ae2-b9ff-578b2380a723\" (UID: \"5062a9ae-9470-4ae2-b9ff-578b2380a723\") " Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.342914 4744 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.343252 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.343229 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.346441 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts" (OuterVolumeSpecName: "scripts") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.346891 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz" (OuterVolumeSpecName: "kube-api-access-rg7gz") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "kube-api-access-rg7gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.388855 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.420438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data" (OuterVolumeSpecName: "config-data") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.426141 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5062a9ae-9470-4ae2-b9ff-578b2380a723" (UID: "5062a9ae-9470-4ae2-b9ff-578b2380a723"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444251 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444286 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5062a9ae-9470-4ae2-b9ff-578b2380a723-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444315 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444327 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444336 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5062a9ae-9470-4ae2-b9ff-578b2380a723-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.444345 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg7gz\" (UniqueName: \"kubernetes.io/projected/5062a9ae-9470-4ae2-b9ff-578b2380a723-kube-api-access-rg7gz\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.726236 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5062a9ae-9470-4ae2-b9ff-578b2380a723","Type":"ContainerDied","Data":"b81de6449a351666655a49f3eed62385c5d64835ebcf612eb7e7209112b5633d"} Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.726270 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.726303 4744 scope.go:117] "RemoveContainer" containerID="62a3d9f2676fa459a3197cb62a233700f1071c33bf38dc797792607b3ee2da8b" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.738191 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"1b1577b0-93ce-41d3-9c87-6009a42d525a","Type":"ContainerStarted","Data":"ba292edebbb4be25a2b0b75cf3f7d41092b36f113c43fa8471cadf6b0afaa2f1"} Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.738514 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.752687 4744 scope.go:117] "RemoveContainer" containerID="34ba9d8ba2009a5bf48d6d36dd72385f233a112ba78f5c9bbe22c9ba7891014e" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.785648 4744 scope.go:117] "RemoveContainer" containerID="f0aa525ff9d786ec806bc5fe8fe7221c84b35cb3f719b5ec702810742af9670c" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.786040 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.437405901 podStartE2EDuration="2.786031097s" podCreationTimestamp="2025-12-05 20:33:29 +0000 UTC" firstStartedPulling="2025-12-05 20:33:30.472446022 +0000 UTC m=+1380.702257390" lastFinishedPulling="2025-12-05 20:33:30.821071218 +0000 UTC m=+1381.050882586" observedRunningTime="2025-12-05 20:33:31.763627529 +0000 UTC m=+1381.993438907" watchObservedRunningTime="2025-12-05 20:33:31.786031097 +0000 UTC m=+1382.015842465" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.803216 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.813651 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.834484 4744 scope.go:117] "RemoveContainer" containerID="7fd972dece50ec445a276426f3ff87191707308111164d128dcc0487636eef48" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.847801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:31 crc kubenswrapper[4744]: E1205 20:33:31.848226 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="proxy-httpd" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848248 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="proxy-httpd" Dec 05 20:33:31 crc kubenswrapper[4744]: E1205 20:33:31.848264 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="sg-core" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848272 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="sg-core" Dec 05 20:33:31 crc kubenswrapper[4744]: E1205 20:33:31.848290 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-notification-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848314 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" 
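
The cpu_manager / state_mem / memory_manager triplets above run when a pod name is readmitted after deletion: each resource manager drops any checkpointed assignment still keyed to the old pod UID's containers before the replacement is admitted (the E-level lines are noisy logging of that cleanup, not failures). A toy sketch of the pattern, with invented cpuset values and illustrative types:

    package main

    import "fmt"

    // key mirrors how per-container resource state is addressed in the log:
    // by pod UID plus container name. The cpuset strings are made up.
    type key struct{ podUID, container string }

    func main() {
        assignments := map[key]string{
            {"5062a9ae-9470-4ae2-b9ff-578b2380a723", "proxy-httpd"}:                   "cpus 0-3",
            {"5062a9ae-9470-4ae2-b9ff-578b2380a723", "sg-core"}:                       "cpus 0-3",
            {"5062a9ae-9470-4ae2-b9ff-578b2380a723", "ceilometer-notification-agent"}: "cpus 0-3",
            {"5062a9ae-9470-4ae2-b9ff-578b2380a723", "ceilometer-central-agent"}:      "cpus 0-3",
        }
        stale := "5062a9ae-9470-4ae2-b9ff-578b2380a723"
        for k := range assignments { // deleting during range is safe in Go
            if k.podUID == stale {
                fmt.Printf("RemoveStaleState: dropping %s\n", k.container)
                delete(assignments, k)
            }
        }
        fmt.Println("entries left:", len(assignments))
    }
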
containerName="ceilometer-notification-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: E1205 20:33:31.848327 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-central-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848336 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-central-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848548 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-central-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848571 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="sg-core" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848586 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="proxy-httpd" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.848598 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" containerName="ceilometer-notification-agent" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.850379 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.855687 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.856580 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.861170 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.865621 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952153 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-log-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952454 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952616 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952721 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8pz\" (UniqueName: 
\"kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952795 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952860 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:31 crc kubenswrapper[4744]: I1205 20:33:31.952997 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.054997 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-log-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055066 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055160 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8pz\" (UniqueName: \"kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.055256 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.056244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-log-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.056418 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.062129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.063531 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.064187 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.064364 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.079145 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8pz\" (UniqueName: \"kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz\") pod 
\"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.080150 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.095931 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5062a9ae-9470-4ae2-b9ff-578b2380a723" path="/var/lib/kubelet/pods/5062a9ae-9470-4ae2-b9ff-578b2380a723/volumes" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.173020 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.491869 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-5wppb"] Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.493493 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.501358 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"] Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.502411 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.504311 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.508535 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5wppb"] Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.536925 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"] Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.666471 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntxj2\" (UniqueName: \"kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.666528 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.666622 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6h9d\" (UniqueName: \"kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.666644 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.672192 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.749416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerStarted","Data":"3a5f5f05cb0c01943951921bcc963d8a2e47fbf1e6363adb5e0831aa0ff4d834"} Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.768962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntxj2\" (UniqueName: \"kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.769017 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.769129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.769145 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6h9d\" (UniqueName: \"kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.770870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.770904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.788558 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntxj2\" (UniqueName: 
\"kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2\") pod \"watcher-db-create-5wppb\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.788660 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6h9d\" (UniqueName: \"kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d\") pod \"watcher-43c1-account-create-update-kvtmw\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.816985 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:32 crc kubenswrapper[4744]: I1205 20:33:32.830497 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.213595 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5wppb"] Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.328888 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"] Dec 05 20:33:33 crc kubenswrapper[4744]: W1205 20:33:33.330641 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7c1e46_ec30_458e_8402_8fea6d2ed3b1.slice/crio-fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d WatchSource:0}: Error finding container fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d: Status 404 returned error can't find the container with id fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.780869 4744 generic.go:334] "Generic (PLEG): container finished" podID="0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" containerID="0422966fbd41de2219c78d2eba85d6aa76a02a9daa15ad7be299536d6b089aac" exitCode=0 Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.780978 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" event={"ID":"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1","Type":"ContainerDied","Data":"0422966fbd41de2219c78d2eba85d6aa76a02a9daa15ad7be299536d6b089aac"} Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.781007 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" event={"ID":"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1","Type":"ContainerStarted","Data":"fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d"} Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.782677 4744 generic.go:334] "Generic (PLEG): container finished" podID="b3631e57-e315-4db0-b144-49e90d81ce43" containerID="096a072fc31b569bcb2876897ef894f9e2e1a82acb4b7eadb9b29e34be74b7b5" exitCode=0 Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.782855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5wppb" event={"ID":"b3631e57-e315-4db0-b144-49e90d81ce43","Type":"ContainerDied","Data":"096a072fc31b569bcb2876897ef894f9e2e1a82acb4b7eadb9b29e34be74b7b5"} Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.782897 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5wppb" event={"ID":"b3631e57-e315-4db0-b144-49e90d81ce43","Type":"ContainerStarted","Data":"23dfdee47eb741afe19497f2f8d49bfbe25f51984bd5065a8b91922b0baa2b94"} Dec 05 20:33:33 crc kubenswrapper[4744]: I1205 20:33:33.786950 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerStarted","Data":"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8"} Dec 05 20:33:34 crc kubenswrapper[4744]: I1205 20:33:34.795082 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerStarted","Data":"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66"} Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.126743 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.211983 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6h9d\" (UniqueName: \"kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d\") pod \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.212077 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts\") pod \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\" (UID: \"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1\") " Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.214100 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" (UID: "0e7c1e46-ec30-458e-8402-8fea6d2ed3b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.246154 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d" (OuterVolumeSpecName: "kube-api-access-s6h9d") pod "0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" (UID: "0e7c1e46-ec30-458e-8402-8fea6d2ed3b1"). InnerVolumeSpecName "kube-api-access-s6h9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.314513 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6h9d\" (UniqueName: \"kubernetes.io/projected/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-kube-api-access-s6h9d\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.314548 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.362338 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5wppb" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.415962 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts\") pod \"b3631e57-e315-4db0-b144-49e90d81ce43\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.416034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntxj2\" (UniqueName: \"kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2\") pod \"b3631e57-e315-4db0-b144-49e90d81ce43\" (UID: \"b3631e57-e315-4db0-b144-49e90d81ce43\") " Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.416503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3631e57-e315-4db0-b144-49e90d81ce43" (UID: "b3631e57-e315-4db0-b144-49e90d81ce43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.416846 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3631e57-e315-4db0-b144-49e90d81ce43-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.420071 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2" (OuterVolumeSpecName: "kube-api-access-ntxj2") pod "b3631e57-e315-4db0-b144-49e90d81ce43" (UID: "b3631e57-e315-4db0-b144-49e90d81ce43"). InnerVolumeSpecName "kube-api-access-ntxj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.518042 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntxj2\" (UniqueName: \"kubernetes.io/projected/b3631e57-e315-4db0-b144-49e90d81ce43-kube-api-access-ntxj2\") on node \"crc\" DevicePath \"\"" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.806387 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw" event={"ID":"0e7c1e46-ec30-458e-8402-8fea6d2ed3b1","Type":"ContainerDied","Data":"fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d"} Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.806485 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbc49cc1ea91a53ea0688345bfb35ecd5ce9006e992e0358fe30d9b9b622a32d" Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.806438 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"
Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.808629 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5wppb" event={"ID":"b3631e57-e315-4db0-b144-49e90d81ce43","Type":"ContainerDied","Data":"23dfdee47eb741afe19497f2f8d49bfbe25f51984bd5065a8b91922b0baa2b94"}
Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.808664 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23dfdee47eb741afe19497f2f8d49bfbe25f51984bd5065a8b91922b0baa2b94"
Dec 05 20:33:35 crc kubenswrapper[4744]: I1205 20:33:35.808677 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5wppb"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.751087 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-992h9"]
Dec 05 20:33:37 crc kubenswrapper[4744]: E1205 20:33:37.751951 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3631e57-e315-4db0-b144-49e90d81ce43" containerName="mariadb-database-create"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.751966 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3631e57-e315-4db0-b144-49e90d81ce43" containerName="mariadb-database-create"
Dec 05 20:33:37 crc kubenswrapper[4744]: E1205 20:33:37.751984 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" containerName="mariadb-account-create-update"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.751992 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" containerName="mariadb-account-create-update"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.752176 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3631e57-e315-4db0-b144-49e90d81ce43" containerName="mariadb-database-create"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.752199 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" containerName="mariadb-account-create-update"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.752848 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
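
The "RemoveStaleState: removing container" / "Deleted CPUSet assignment" pairs above show the cpu and memory managers dropping resource-accounting entries for containers of the just-finished db-create and account-create pods before the next pod is admitted. A sketch of that sweep, with hypothetical types (the real managers keep checkpointed per-container state, which is what state_mem.go records):

```go
// Sketch of the stale-state sweep logged by cpu_manager.go:410 and
// memory_manager.go:354. Types are hypothetical; kubelet's managers keep
// per-container assignments in a checkpointed state store.
package main

import "fmt"

// removeStaleState drops assignments whose pod is no longer active.
// assignments maps podUID -> containerName -> reserved resources.
func removeStaleState(assignments map[string]map[string]string, activePods map[string]bool) {
	for podUID, containers := range assignments {
		if activePods[podUID] {
			continue
		}
		for containerName := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				podUID, containerName)
		}
		delete(assignments, podUID) // i.e. "Deleted CPUSet assignment"
	}
}

func main() {
	a := map[string]map[string]string{
		"b3631e57-e315-4db0-b144-49e90d81ce43": {"mariadb-database-create": "cpuset"},
	}
	removeStaleState(a, map[string]bool{}) // the pod has already exited
}
```
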
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.755249 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6fw64"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.757597 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.767822 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-992h9"]
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.828756 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerStarted","Data":"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d"}
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.863979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8rv\" (UniqueName: \"kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.864060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.864139 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.864323 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.965020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.965127 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.965213 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.965284 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8rv\" (UniqueName: \"kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.972239 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.972344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:37 crc kubenswrapper[4744]: I1205 20:33:37.972983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:38 crc kubenswrapper[4744]: I1205 20:33:38.003945 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8rv\" (UniqueName: \"kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv\") pod \"watcher-kuttl-db-sync-992h9\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:33:38 crc kubenswrapper[4744]: I1205 20:33:38.078261 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:33:38 crc kubenswrapper[4744]: I1205 20:33:38.562663 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-992h9"]
Dec 05 20:33:38 crc kubenswrapper[4744]: W1205 20:33:38.577069 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda42ec6d7_1366_4466_b366_b1de21d62ef2.slice/crio-c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55 WatchSource:0}: Error finding container c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55: Status 404 returned error can't find the container with id c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55
Dec 05 20:33:38 crc kubenswrapper[4744]: I1205 20:33:38.840915 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" event={"ID":"a42ec6d7-1366-4466-b366-b1de21d62ef2","Type":"ContainerStarted","Data":"c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55"}
Dec 05 20:33:39 crc kubenswrapper[4744]: I1205 20:33:39.859880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerStarted","Data":"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8"}
Dec 05 20:33:39 crc kubenswrapper[4744]: I1205 20:33:39.860703 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:33:39 crc kubenswrapper[4744]: I1205 20:33:39.889845 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.310709396 podStartE2EDuration="8.889824093s" podCreationTimestamp="2025-12-05 20:33:31 +0000 UTC" firstStartedPulling="2025-12-05 20:33:32.670166407 +0000 UTC m=+1382.899977775" lastFinishedPulling="2025-12-05 20:33:39.249281114 +0000 UTC m=+1389.479092472" observedRunningTime="2025-12-05 20:33:39.879519749 +0000 UTC m=+1390.109331137" watchObservedRunningTime="2025-12-05 20:33:39.889824093 +0000 UTC m=+1390.119635461"
Dec 05 20:33:40 crc kubenswrapper[4744]: I1205 20:33:40.095895 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.480488 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"]
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.482235 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6d8v"
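
"No sandbox for pod can be found. Need to start a new one" is the sync loop deciding that a freshly added pod has no usable sandbox yet, so one must be created before any of its containers can start. A sketch of that decision, with hypothetical types (in kubelet this is part of computing the pod's sync actions):

```go
// Sketch of the sandbox decision behind "No sandbox for pod can be found.
// Need to start a new one". Types are hypothetical, not kubelet's own.
package main

import "fmt"

type sandboxStatus struct {
	ID    string
	Ready bool
}

// needsNewSandbox reports whether the pod has no ready sandbox at all,
// in which case the sync loop schedules creation of a fresh one.
func needsNewSandbox(sandboxes []sandboxStatus) bool {
	for _, s := range sandboxes {
		if s.Ready {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(needsNewSandbox(nil)) // true: brand-new pod, start a sandbox
}
```
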
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.487032 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ds9\" (UniqueName: \"kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.487214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.487258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.511352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"]
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.589530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.589581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.589663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ds9\" (UniqueName: \"kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.590406 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.590455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.613853 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z7ds9\" (UniqueName: \"kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9\") pod \"redhat-operators-x6d8v\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " pod="openshift-marketplace/redhat-operators-x6d8v" Dec 05 20:33:42 crc kubenswrapper[4744]: I1205 20:33:42.816766 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6d8v" Dec 05 20:33:55 crc kubenswrapper[4744]: E1205 20:33:55.391773 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 05 20:33:55 crc kubenswrapper[4744]: E1205 20:33:55.392458 4744 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.9:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 05 20:33:55 crc kubenswrapper[4744]: E1205 20:33:55.392656 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-kuttl-db-sync,Image:38.102.83.9:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hd8rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-kuttl-db-sync-992h9_watcher-kuttl-default(a42ec6d7-1366-4466-b366-b1de21d62ef2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 20:33:55 crc kubenswrapper[4744]: E1205 20:33:55.394073 4744 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" Dec 05 20:33:55 crc kubenswrapper[4744]: I1205 20:33:55.806463 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"] Dec 05 20:33:56 crc kubenswrapper[4744]: I1205 20:33:56.006688 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerStarted","Data":"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"} Dec 05 20:33:56 crc kubenswrapper[4744]: I1205 20:33:56.006747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerStarted","Data":"b3c34e7816352f610b06f25d0a6e4f17faf195f8212a0e7efd1124c25010a05b"} Dec 05 20:33:56 crc kubenswrapper[4744]: E1205 20:33:56.009014 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.9:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" Dec 05 20:33:57 crc kubenswrapper[4744]: I1205 20:33:57.015494 4744 generic.go:334] "Generic (PLEG): container finished" podID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerID="f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9" exitCode=0 Dec 05 20:33:57 crc kubenswrapper[4744]: I1205 20:33:57.016043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerDied","Data":"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"} Dec 05 20:33:58 crc kubenswrapper[4744]: I1205 20:33:58.026204 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerStarted","Data":"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"} Dec 05 20:33:59 crc kubenswrapper[4744]: I1205 20:33:59.036881 4744 generic.go:334] "Generic (PLEG): container finished" podID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerID="811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae" exitCode=0 Dec 05 20:33:59 crc kubenswrapper[4744]: I1205 20:33:59.036920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerDied","Data":"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"} Dec 05 20:34:00 crc kubenswrapper[4744]: I1205 20:34:00.047475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerStarted","Data":"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"} Dec 05 20:34:00 crc kubenswrapper[4744]: I1205 20:34:00.067583 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6d8v" podStartSLOduration=15.605143865 
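
The startup-latency entry above decomposes cleanly: podStartSLOduration appears to be the end-to-end duration minus time spent pulling images, i.e. SLO = E2E - (lastFinishedPulling - firstStartedPulling), and the monotonic m= offsets in this entry reproduce the logged value exactly. A quick check:

```go
// Verifying the pod_startup_latency_tracker arithmetic from the entry above:
// podStartSLOduration = podStartE2EDuration - image pull time, where pull
// time is lastFinishedPulling - firstStartedPulling (monotonic m= offsets).
package main

import "fmt"

func main() {
	const (
		e2e                 = 18.06756105    // podStartE2EDuration, seconds
		firstStartedPulling = 1407.249274248 // m= offsets from the log
		lastFinishedPulling = 1409.711691433
	)
	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Printf("podStartSLOduration=%.9f\n", slo) // prints 15.605143865
}
```

The same relation holds for the watcher-kuttl-db-sync-992h9 entry below (34.174920366 - 31.96230022 = 2.212620146).
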
Dec 05 20:34:02 crc kubenswrapper[4744]: I1205 20:34:02.265813 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:34:02 crc kubenswrapper[4744]: I1205 20:34:02.817939 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:02 crc kubenswrapper[4744]: I1205 20:34:02.817993 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:03 crc kubenswrapper[4744]: I1205 20:34:03.857633 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6d8v" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="registry-server" probeResult="failure" output=<
Dec 05 20:34:03 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s
Dec 05 20:34:03 crc kubenswrapper[4744]: >
Dec 05 20:34:11 crc kubenswrapper[4744]: I1205 20:34:11.145794 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" event={"ID":"a42ec6d7-1366-4466-b366-b1de21d62ef2","Type":"ContainerStarted","Data":"53c6dc96f48fffaea4f1262bfbca8fe80ab2aa9fe1ace6003637c6ebc4a8b721"}
Dec 05 20:34:11 crc kubenswrapper[4744]: I1205 20:34:11.174939 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" podStartSLOduration=2.212620146 podStartE2EDuration="34.174920366s" podCreationTimestamp="2025-12-05 20:33:37 +0000 UTC" firstStartedPulling="2025-12-05 20:33:38.579233003 +0000 UTC m=+1388.809044371" lastFinishedPulling="2025-12-05 20:34:10.541533223 +0000 UTC m=+1420.771344591" observedRunningTime="2025-12-05 20:34:11.170476619 +0000 UTC m=+1421.400287997" watchObservedRunningTime="2025-12-05 20:34:11.174920366 +0000 UTC m=+1421.404731734"
Dec 05 20:34:12 crc kubenswrapper[4744]: I1205 20:34:12.866776 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:12 crc kubenswrapper[4744]: I1205 20:34:12.919458 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:14 crc kubenswrapper[4744]: I1205 20:34:14.182853 4744 generic.go:334] "Generic (PLEG): container finished" podID="a42ec6d7-1366-4466-b366-b1de21d62ef2" containerID="53c6dc96f48fffaea4f1262bfbca8fe80ab2aa9fe1ace6003637c6ebc4a8b721" exitCode=0
Dec 05 20:34:14 crc kubenswrapper[4744]: I1205 20:34:14.182920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" event={"ID":"a42ec6d7-1366-4466-b366-b1de21d62ef2","Type":"ContainerDied","Data":"53c6dc96f48fffaea4f1262bfbca8fe80ab2aa9fe1ace6003637c6ebc4a8b721"}
Dec 05 20:34:14 crc kubenswrapper[4744]: I1205 20:34:14.656923 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"]
Dec 05 20:34:14 crc kubenswrapper[4744]: I1205 20:34:14.657168 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6d8v" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="registry-server" containerID="cri-o://7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02" gracePeriod=2
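
Once the DELETE arrives from the API, kubelet kills registry-server with gracePeriod=2: a stop request first, and a forced kill only if the container has not exited when the grace window lapses. A sketch of that shape (stop/forceKill/exited are hypothetical callbacks, not kubelet's actual signatures):

```go
// Sketch of a grace-period kill like the one logged above: ask the
// container to stop, wait up to the grace period, then force-kill.
package main

import (
	"fmt"
	"time"
)

func killWithGrace(stop, forceKill func() error, exited <-chan struct{}, grace time.Duration) error {
	if err := stop(); err != nil {
		return err
	}
	select {
	case <-exited:
		return nil // clean exit within the grace period
	case <-time.After(grace):
		return forceKill() // grace period elapsed; SIGKILL equivalent
	}
}

func main() {
	exited := make(chan struct{})
	close(exited) // pretend the container exits immediately
	err := killWithGrace(
		func() error { return nil },
		func() error { return fmt.Errorf("force kill") },
		exited, 2*time.Second,
	)
	fmt.Println("killWithGrace:", err) // <nil>
}
```
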
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.497187 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9"
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.601160 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.608355 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data\") pod \"a42ec6d7-1366-4466-b366-b1de21d62ef2\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") "
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.608559 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data\") pod \"a42ec6d7-1366-4466-b366-b1de21d62ef2\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") "
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.608730 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle\") pod \"a42ec6d7-1366-4466-b366-b1de21d62ef2\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") "
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.608930 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd8rv\" (UniqueName: \"kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv\") pod \"a42ec6d7-1366-4466-b366-b1de21d62ef2\" (UID: \"a42ec6d7-1366-4466-b366-b1de21d62ef2\") "
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.613760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv" (OuterVolumeSpecName: "kube-api-access-hd8rv") pod "a42ec6d7-1366-4466-b366-b1de21d62ef2" (UID: "a42ec6d7-1366-4466-b366-b1de21d62ef2"). InnerVolumeSpecName "kube-api-access-hd8rv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.614436 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a42ec6d7-1366-4466-b366-b1de21d62ef2" (UID: "a42ec6d7-1366-4466-b366-b1de21d62ef2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.636493 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a42ec6d7-1366-4466-b366-b1de21d62ef2" (UID: "a42ec6d7-1366-4466-b366-b1de21d62ef2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.650537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data" (OuterVolumeSpecName: "config-data") pod "a42ec6d7-1366-4466-b366-b1de21d62ef2" (UID: "a42ec6d7-1366-4466-b366-b1de21d62ef2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities\") pod \"a527f5e8-bdf9-42d1-8829-55ca955b8148\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710359 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content\") pod \"a527f5e8-bdf9-42d1-8829-55ca955b8148\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710576 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7ds9\" (UniqueName: \"kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9\") pod \"a527f5e8-bdf9-42d1-8829-55ca955b8148\" (UID: \"a527f5e8-bdf9-42d1-8829-55ca955b8148\") " Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710952 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710980 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.710994 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a42ec6d7-1366-4466-b366-b1de21d62ef2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.711007 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd8rv\" (UniqueName: \"kubernetes.io/projected/a42ec6d7-1366-4466-b366-b1de21d62ef2-kube-api-access-hd8rv\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.711812 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities" (OuterVolumeSpecName: "utilities") pod "a527f5e8-bdf9-42d1-8829-55ca955b8148" (UID: "a527f5e8-bdf9-42d1-8829-55ca955b8148"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.715293 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9" (OuterVolumeSpecName: "kube-api-access-z7ds9") pod "a527f5e8-bdf9-42d1-8829-55ca955b8148" (UID: "a527f5e8-bdf9-42d1-8829-55ca955b8148"). InnerVolumeSpecName "kube-api-access-z7ds9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.812799 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7ds9\" (UniqueName: \"kubernetes.io/projected/a527f5e8-bdf9-42d1-8829-55ca955b8148-kube-api-access-z7ds9\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.812870 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.832272 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a527f5e8-bdf9-42d1-8829-55ca955b8148" (UID: "a527f5e8-bdf9-42d1-8829-55ca955b8148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:15 crc kubenswrapper[4744]: I1205 20:34:15.914102 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a527f5e8-bdf9-42d1-8829-55ca955b8148-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.207839 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" event={"ID":"a42ec6d7-1366-4466-b366-b1de21d62ef2","Type":"ContainerDied","Data":"c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55"} Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.207887 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c97a62b26e75d3fc1ebe0d57ef35928c11070cd7da007c7a41d134d99d10df55" Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.207887 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-992h9" Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.212833 4744 generic.go:334] "Generic (PLEG): container finished" podID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerID="7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02" exitCode=0 Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.212888 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6d8v"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.212900 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerDied","Data":"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"}
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.212963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6d8v" event={"ID":"a527f5e8-bdf9-42d1-8829-55ca955b8148","Type":"ContainerDied","Data":"b3c34e7816352f610b06f25d0a6e4f17faf195f8212a0e7efd1124c25010a05b"}
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.212997 4744 scope.go:117] "RemoveContainer" containerID="7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.243747 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.249250 4744 scope.go:117] "RemoveContainer" containerID="811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.259143 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6d8v"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.292749 4744 scope.go:117] "RemoveContainer" containerID="f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.321006 4744 scope.go:117] "RemoveContainer" containerID="7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.321489 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02\": container with ID starting with 7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02 not found: ID does not exist" containerID="7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.321538 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02"} err="failed to get container status \"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02\": rpc error: code = NotFound desc = could not find container \"7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02\": container with ID starting with 7c93c4a6414956200f2efba459e9864ace940ef3cd59daca3e5a9e2a6a199b02 not found: ID does not exist"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.321571 4744 scope.go:117] "RemoveContainer" containerID="811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.322428 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae\": container with ID starting with 811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae not found: ID does not exist" containerID="811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.322458 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae"} err="failed to get container status \"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae\": rpc error: code = NotFound desc = could not find container \"811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae\": container with ID starting with 811c4472f15d662a7bfbbcafb789761b6c5a8aa2d0a00818951a9f707e4a9dae not found: ID does not exist"
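
These NotFound errors are benign: the containers were already removed along with the pod, so the explicit RemoveContainer pass triggered by the REMOVE event finds nothing left to delete, and kubelet only logs the condition. Cleanup paths of this kind are usually written to treat NotFound as success; a sketch, reusing the gRPC status helpers (remove is a hypothetical callback standing in for the runtime call):

```go
// Sketch: idempotent container deletion -- a NotFound status from the
// runtime means the container is already gone, which counts as success
// for a cleanup path.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func removeContainerIdempotent(remove func(id string) error, id string) error {
	err := remove(id)
	if err == nil {
		return nil
	}
	if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
		return nil // already deleted elsewhere; nothing left to do
	}
	return err
}

func main() {
	remove := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	err := removeContainerIdempotent(remove, "7c93c4a64149...")
	fmt.Println("cleanup error:", err) // <nil>
}
```
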
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.322482 4744 scope.go:117] "RemoveContainer" containerID="f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.322900 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9\": container with ID starting with f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9 not found: ID does not exist" containerID="f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.322945 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9"} err="failed to get container status \"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9\": rpc error: code = NotFound desc = could not find container \"f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9\": container with ID starting with f2d810efe57ea77ce68f237a6898825d08e70f9588354bdec75101b2e2196ac9 not found: ID does not exist"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.621619 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.622850 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" containerName="watcher-kuttl-db-sync"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.622978 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" containerName="watcher-kuttl-db-sync"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.623061 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="extract-content"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.623130 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="extract-content"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.623202 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="registry-server"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.623267 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="registry-server"
Dec 05 20:34:16 crc kubenswrapper[4744]: E1205 20:34:16.623433 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="extract-utilities"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.623506 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="extract-utilities"
Dec 05 20:34:16 crc 
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.623817 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" containerName="registry-server"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.623935 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" containerName="watcher-kuttl-db-sync"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.624826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.647813 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6fw64"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.648073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.682140 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.701462 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.701582 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.703736 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.703994 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.705041 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.708018 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.711032 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.725108 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.726019 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.727318 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.727361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfjp\" (UniqueName: \"kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.727390 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829128 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829193 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829216 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qx2k\" (UniqueName: \"kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829343 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65rvf\" (UniqueName: \"kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829482 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829512 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfjp\" (UniqueName: \"kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829572 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.829691 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.830274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.833486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.833860 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.849863 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfjp\" (UniqueName: \"kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp\") pod \"watcher-kuttl-applier-0\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930622 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930644 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930671 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930708 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930724 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qx2k\" (UniqueName: \"kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930764 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65rvf\" (UniqueName: \"kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.930784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.931848 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.932019 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.934240 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.934260 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.934365 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.940724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.941046 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.943880 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.946927 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.947461 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65rvf\" (UniqueName: \"kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf\") pod \"watcher-kuttl-api-0\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:16 crc kubenswrapper[4744]: I1205 20:34:16.948568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qx2k\" (UniqueName: \"kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:17 crc kubenswrapper[4744]: I1205 20:34:17.020365 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:17 crc kubenswrapper[4744]: I1205 20:34:17.027602 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:17 crc kubenswrapper[4744]: I1205 20:34:17.461557 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:34:17 crc kubenswrapper[4744]: W1205 20:34:17.466117 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33dfed55_19a0_4b9e_b75e_f7ba3d14efb4.slice/crio-00675d49bc6918b1944adb94eaf8e1d4e7431f6d887f5afbb15fc8619937c23d WatchSource:0}: Error finding container 00675d49bc6918b1944adb94eaf8e1d4e7431f6d887f5afbb15fc8619937c23d: Status 404 returned error can't find the container with id 00675d49bc6918b1944adb94eaf8e1d4e7431f6d887f5afbb15fc8619937c23d
Dec 05 20:34:17 crc kubenswrapper[4744]: I1205 20:34:17.631683 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:34:17 crc kubenswrapper[4744]: W1205 20:34:17.646122 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99c95d4_70dd_484d_9fca_ca76589af16e.slice/crio-f28d3a4721e33a0f7cb46ad42d5b1b4a08a0a6c8ed71ba46f11efbf63b7fb9fc WatchSource:0}: Error finding container f28d3a4721e33a0f7cb46ad42d5b1b4a08a0a6c8ed71ba46f11efbf63b7fb9fc: Status 404 returned error can't find the container with id f28d3a4721e33a0f7cb46ad42d5b1b4a08a0a6c8ed71ba46f11efbf63b7fb9fc
Dec 05 20:34:17 crc kubenswrapper[4744]: I1205 20:34:17.658617 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:34:17 crc kubenswrapper[4744]: W1205 20:34:17.664776 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d08f844_648a_40f2_b722_25f3a8c4e502.slice/crio-dd60fdf6ec234677f79db99fce37744b4dacaff1b9967636661697c92b352e44 WatchSource:0}: Error finding container dd60fdf6ec234677f79db99fce37744b4dacaff1b9967636661697c92b352e44: Status 404 returned error can't find the container with id dd60fdf6ec234677f79db99fce37744b4dacaff1b9967636661697c92b352e44
Dec 05 20:34:18 crc kubenswrapper[4744]: I1205 20:34:18.101457 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a527f5e8-bdf9-42d1-8829-55ca955b8148" path="/var/lib/kubelet/pods/a527f5e8-bdf9-42d1-8829-55ca955b8148/volumes"
Dec 05 20:34:18 crc kubenswrapper[4744]: I1205 20:34:18.267668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerStarted","Data":"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"}
Dec 05 20:34:18 crc kubenswrapper[4744]: I1205 20:34:18.268013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerStarted","Data":"dd60fdf6ec234677f79db99fce37744b4dacaff1b9967636661697c92b352e44"}
Dec 05 20:34:18 crc kubenswrapper[4744]: I1205 20:34:18.268670 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d99c95d4-70dd-484d-9fca-ca76589af16e","Type":"ContainerStarted","Data":"f28d3a4721e33a0f7cb46ad42d5b1b4a08a0a6c8ed71ba46f11efbf63b7fb9fc"}
Dec 05 20:34:18 crc kubenswrapper[4744]: I1205 20:34:18.269548 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4","Type":"ContainerStarted","Data":"00675d49bc6918b1944adb94eaf8e1d4e7431f6d887f5afbb15fc8619937c23d"}
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.280780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerStarted","Data":"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"}
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.281353 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.283428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d99c95d4-70dd-484d-9fca-ca76589af16e","Type":"ContainerStarted","Data":"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31"}
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.286162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4","Type":"ContainerStarted","Data":"89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf"}
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.306653 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.306623099 podStartE2EDuration="3.306623099s" podCreationTimestamp="2025-12-05 20:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:19.298612262 +0000 UTC m=+1429.528423650" watchObservedRunningTime="2025-12-05 20:34:19.306623099 +0000 UTC m=+1429.536434477"
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.317788 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.242391717 podStartE2EDuration="3.317759772s" podCreationTimestamp="2025-12-05 20:34:16 +0000 UTC" firstStartedPulling="2025-12-05 20:34:17.469108007 +0000 UTC m=+1427.698919365" lastFinishedPulling="2025-12-05 20:34:18.544476052 +0000 UTC m=+1428.774287420" observedRunningTime="2025-12-05 20:34:19.315373433 +0000 UTC m=+1429.545184791" watchObservedRunningTime="2025-12-05 20:34:19.317759772 +0000 UTC m=+1429.547571150"
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.335190 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.449300615 podStartE2EDuration="3.335163129s" podCreationTimestamp="2025-12-05 20:34:16 +0000 UTC" firstStartedPulling="2025-12-05 20:34:17.66033471 +0000 UTC m=+1427.890146078" lastFinishedPulling="2025-12-05 20:34:18.546197224 +0000 UTC m=+1428.776008592" observedRunningTime="2025-12-05 20:34:19.334415821 +0000 UTC m=+1429.564227189" watchObservedRunningTime="2025-12-05 20:34:19.335163129 +0000 UTC m=+1429.564974517"
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.806960 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:34:19 crc kubenswrapper[4744]: I1205 20:34:19.807072 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:34:21 crc kubenswrapper[4744]: I1205 20:34:21.520785 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:21 crc kubenswrapper[4744]: I1205 20:34:21.947586 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:22 crc kubenswrapper[4744]: I1205 20:34:22.020494 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:26 crc kubenswrapper[4744]: I1205 20:34:26.947866 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:26 crc kubenswrapper[4744]: I1205 20:34:26.973244 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.021225 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.027523 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.027872 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.064509 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.359319 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.363465 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.386664 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:27 crc kubenswrapper[4744]: I1205 20:34:27.400244 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.400731 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
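[Editor's note] The machine-config-daemon liveness failure appears unrelated to the watcher test: its health endpoint on 127.0.0.1:8798 simply refused the connection. The watcher pods, by contrast, walk the normal probe sequence, startup "unhealthy" then "started", then readiness "ready", all within about ten seconds. An HTTP GET probe of this kind reduces to a request with a short timeout where a connection error or error status counts as failure; a minimal Go sketch (the URL is the one from the failing probe above, used purely as an example):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP returns nil only for a 2xx/3xx response, mirroring how an HTTP
// health check treats connection refusals and error statuses as failures.
func probeHTTP(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	fmt.Println(probeHTTP("http://127.0.0.1:8798/health"))
}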
containerName="ceilometer-central-agent" containerID="cri-o://1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8" gracePeriod=30 Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.401121 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="sg-core" containerID="cri-o://2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d" gracePeriod=30 Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.401168 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-notification-agent" containerID="cri-o://b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66" gracePeriod=30 Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.401155 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="proxy-httpd" containerID="cri-o://ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8" gracePeriod=30 Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.911019 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-992h9"] Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.916865 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-992h9"] Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.951790 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.964362 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher43c1-account-delete-66tz9"] Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.965585 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:29 crc kubenswrapper[4744]: I1205 20:34:29.974951 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher43c1-account-delete-66tz9"] Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.017269 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.017502 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerName="watcher-applier" containerID="cri-o://89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" gracePeriod=30 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.040155 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.040459 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-kuttl-api-log" containerID="cri-o://ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a" gracePeriod=30 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.040824 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-api" containerID="cri-o://0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776" gracePeriod=30 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.114153 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42ec6d7-1366-4466-b366-b1de21d62ef2" path="/var/lib/kubelet/pods/a42ec6d7-1366-4466-b366-b1de21d62ef2/volumes" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.147050 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fk8p\" (UniqueName: \"kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p\") pod \"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.147220 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts\") pod \"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.248240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fk8p\" (UniqueName: \"kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p\") pod \"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.248870 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts\") pod 
\"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.250223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts\") pod \"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.266774 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fk8p\" (UniqueName: \"kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p\") pod \"watcher43c1-account-delete-66tz9\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.281269 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500829 4744 generic.go:334] "Generic (PLEG): container finished" podID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerID="ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8" exitCode=0 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500861 4744 generic.go:334] "Generic (PLEG): container finished" podID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerID="2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d" exitCode=2 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500871 4744 generic.go:334] "Generic (PLEG): container finished" podID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerID="1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8" exitCode=0 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerDied","Data":"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8"} Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerDied","Data":"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d"} Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.500975 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerDied","Data":"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8"} Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.505160 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerID="ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a" exitCode=143 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.505354 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="d99c95d4-70dd-484d-9fca-ca76589af16e" containerName="watcher-decision-engine" containerID="cri-o://2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31" gracePeriod=30 Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 
Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.505423 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerDied","Data":"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"}
Dec 05 20:34:30 crc kubenswrapper[4744]: I1205 20:34:30.608578 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher43c1-account-delete-66tz9"]
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.292877 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.373714 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca\") pod \"8d08f844-648a-40f2-b722-25f3a8c4e502\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") "
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.373793 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle\") pod \"8d08f844-648a-40f2-b722-25f3a8c4e502\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") "
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.373855 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs\") pod \"8d08f844-648a-40f2-b722-25f3a8c4e502\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") "
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.373882 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data\") pod \"8d08f844-648a-40f2-b722-25f3a8c4e502\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") "
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.373927 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65rvf\" (UniqueName: \"kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf\") pod \"8d08f844-648a-40f2-b722-25f3a8c4e502\" (UID: \"8d08f844-648a-40f2-b722-25f3a8c4e502\") "
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.374275 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs" (OuterVolumeSpecName: "logs") pod "8d08f844-648a-40f2-b722-25f3a8c4e502" (UID: "8d08f844-648a-40f2-b722-25f3a8c4e502"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.379007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf" (OuterVolumeSpecName: "kube-api-access-65rvf") pod "8d08f844-648a-40f2-b722-25f3a8c4e502" (UID: "8d08f844-648a-40f2-b722-25f3a8c4e502"). InnerVolumeSpecName "kube-api-access-65rvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.398444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d08f844-648a-40f2-b722-25f3a8c4e502" (UID: "8d08f844-648a-40f2-b722-25f3a8c4e502"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.413404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8d08f844-648a-40f2-b722-25f3a8c4e502" (UID: "8d08f844-648a-40f2-b722-25f3a8c4e502"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.430445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data" (OuterVolumeSpecName: "config-data") pod "8d08f844-648a-40f2-b722-25f3a8c4e502" (UID: "8d08f844-648a-40f2-b722-25f3a8c4e502"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.475339 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.475382 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d08f844-648a-40f2-b722-25f3a8c4e502-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.475393 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.475406 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65rvf\" (UniqueName: \"kubernetes.io/projected/8d08f844-648a-40f2-b722-25f3a8c4e502-kube-api-access-65rvf\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.475418 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8d08f844-648a-40f2-b722-25f3a8c4e502-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.516999 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc65c044-7e38-4173-9128-fe84d1866ae7" containerID="8efd5c115428d8639f692cebc8eae23d009bf945dc3f5a81f0247ca2b8ad33d7" exitCode=0
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.517128 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" event={"ID":"bc65c044-7e38-4173-9128-fe84d1866ae7","Type":"ContainerDied","Data":"8efd5c115428d8639f692cebc8eae23d009bf945dc3f5a81f0247ca2b8ad33d7"}
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.517193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" event={"ID":"bc65c044-7e38-4173-9128-fe84d1866ae7","Type":"ContainerStarted","Data":"df9134c27807409dd87816e09aac0bc27def71221adfe6475b2aa791f542e531"}
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.520850 4744 generic.go:334] "Generic (PLEG): container finished" podID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerID="0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776" exitCode=0
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.520890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerDied","Data":"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"}
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.520917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8d08f844-648a-40f2-b722-25f3a8c4e502","Type":"ContainerDied","Data":"dd60fdf6ec234677f79db99fce37744b4dacaff1b9967636661697c92b352e44"}
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.520938 4744 scope.go:117] "RemoveContainer" containerID="0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.521143 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.614223 4744 scope.go:117] "RemoveContainer" containerID="ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.624207 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.632679 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.633498 4744 scope.go:117] "RemoveContainer" containerID="0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.634944 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776\": container with ID starting with 0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776 not found: ID does not exist" containerID="0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.634975 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776"} err="failed to get container status \"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776\": rpc error: code = NotFound desc = could not find container \"0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776\": container with ID starting with 0afb5945cea3f16b569bf5cd91c0637c21625d8a5e8c9a149d03ffe4f5212776 not found: ID does not exist"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.634995 4744 scope.go:117] "RemoveContainer" containerID="ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.635323 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a\": container with ID starting with ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a not found: ID does not exist" containerID="ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.635348 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a"} err="failed to get container status \"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a\": rpc error: code = NotFound desc = could not find container \"ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a\": container with ID starting with ef91adae48919d11880208d226567f3646d3ea892a030e49c3ec23d0025a918a not found: ID does not exist"
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.949771 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.951073 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.954568 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:34:31 crc kubenswrapper[4744]: E1205 20:34:31.954630 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerName="watcher-applier"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.966258 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:34:31 crc kubenswrapper[4744]: I1205 20:34:31.970647 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.094727 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" path="/var/lib/kubelet/pods/8d08f844-648a-40f2-b722-25f3a8c4e502/volumes"
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.097531 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-log-httpd\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.097661 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle\") pod \"d99c95d4-70dd-484d-9fca-ca76589af16e\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098190 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098275 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs\") pod \"d99c95d4-70dd-484d-9fca-ca76589af16e\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098383 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt8pz\" (UniqueName: \"kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data\") pod \"d99c95d4-70dd-484d-9fca-ca76589af16e\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098518 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098541 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: 
\"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098663 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs" (OuterVolumeSpecName: "logs") pod "d99c95d4-70dd-484d-9fca-ca76589af16e" (UID: "d99c95d4-70dd-484d-9fca-ca76589af16e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098693 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qx2k\" (UniqueName: \"kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k\") pod \"d99c95d4-70dd-484d-9fca-ca76589af16e\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data\") pod \"770c8bdc-9e6e-450b-a6c9-d579cada809b\" (UID: \"770c8bdc-9e6e-450b-a6c9-d579cada809b\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.098856 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca\") pod \"d99c95d4-70dd-484d-9fca-ca76589af16e\" (UID: \"d99c95d4-70dd-484d-9fca-ca76589af16e\") " Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.099449 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d99c95d4-70dd-484d-9fca-ca76589af16e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.099473 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.099681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.104149 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k" (OuterVolumeSpecName: "kube-api-access-5qx2k") pod "d99c95d4-70dd-484d-9fca-ca76589af16e" (UID: "d99c95d4-70dd-484d-9fca-ca76589af16e"). InnerVolumeSpecName "kube-api-access-5qx2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.106488 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts" (OuterVolumeSpecName: "scripts") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.118702 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz" (OuterVolumeSpecName: "kube-api-access-xt8pz") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "kube-api-access-xt8pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.124025 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d99c95d4-70dd-484d-9fca-ca76589af16e" (UID: "d99c95d4-70dd-484d-9fca-ca76589af16e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.131988 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d99c95d4-70dd-484d-9fca-ca76589af16e" (UID: "d99c95d4-70dd-484d-9fca-ca76589af16e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.157170 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.189074 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.194011 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.195981 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data" (OuterVolumeSpecName: "config-data") pod "d99c95d4-70dd-484d-9fca-ca76589af16e" (UID: "d99c95d4-70dd-484d-9fca-ca76589af16e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201056 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201265 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201373 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201549 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201634 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt8pz\" (UniqueName: \"kubernetes.io/projected/770c8bdc-9e6e-450b-a6c9-d579cada809b-kube-api-access-xt8pz\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201708 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201783 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c95d4-70dd-484d-9fca-ca76589af16e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.201874 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/770c8bdc-9e6e-450b-a6c9-d579cada809b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.202177 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.202264 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qx2k\" (UniqueName: \"kubernetes.io/projected/d99c95d4-70dd-484d-9fca-ca76589af16e-kube-api-access-5qx2k\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.227480 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data" (OuterVolumeSpecName: "config-data") pod "770c8bdc-9e6e-450b-a6c9-d579cada809b" (UID: "770c8bdc-9e6e-450b-a6c9-d579cada809b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.303623 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770c8bdc-9e6e-450b-a6c9-d579cada809b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.531768 4744 generic.go:334] "Generic (PLEG): container finished" podID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerID="b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66" exitCode=0 Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.531835 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerDied","Data":"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66"} Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.531866 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"770c8bdc-9e6e-450b-a6c9-d579cada809b","Type":"ContainerDied","Data":"3a5f5f05cb0c01943951921bcc963d8a2e47fbf1e6363adb5e0831aa0ff4d834"} Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.531885 4744 scope.go:117] "RemoveContainer" containerID="ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.532001 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.536940 4744 generic.go:334] "Generic (PLEG): container finished" podID="d99c95d4-70dd-484d-9fca-ca76589af16e" containerID="2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31" exitCode=0 Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.537035 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d99c95d4-70dd-484d-9fca-ca76589af16e","Type":"ContainerDied","Data":"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31"} Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.537061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d99c95d4-70dd-484d-9fca-ca76589af16e","Type":"ContainerDied","Data":"f28d3a4721e33a0f7cb46ad42d5b1b4a08a0a6c8ed71ba46f11efbf63b7fb9fc"} Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.537118 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.605404 4744 scope.go:117] "RemoveContainer" containerID="2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.625974 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.628727 4744 scope.go:117] "RemoveContainer" containerID="b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.645327 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.660458 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.663749 4744 scope.go:117] "RemoveContainer" containerID="1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.669838 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670410 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-central-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670429 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-central-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670442 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99c95d4-70dd-484d-9fca-ca76589af16e" containerName="watcher-decision-engine" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670449 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99c95d4-70dd-484d-9fca-ca76589af16e" containerName="watcher-decision-engine" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670460 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="sg-core" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670469 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="sg-core" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670491 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-kuttl-api-log" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670499 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-kuttl-api-log" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670523 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="proxy-httpd" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670530 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="proxy-httpd" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670544 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-api" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670552 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-api" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.670568 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-notification-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670576 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-notification-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670827 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-central-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670857 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="ceilometer-notification-agent" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670873 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="sg-core" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670885 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-api" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670909 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" containerName="proxy-httpd" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670922 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99c95d4-70dd-484d-9fca-ca76589af16e" containerName="watcher-decision-engine" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.670934 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d08f844-648a-40f2-b722-25f3a8c4e502" containerName="watcher-kuttl-api-log" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.674414 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.679641 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.679820 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.679981 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.684509 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.691048 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.717975 4744 scope.go:117] "RemoveContainer" containerID="ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.718744 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8\": container with ID starting with ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8 not found: ID does not exist" containerID="ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.718779 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8"} err="failed to get container status \"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8\": rpc error: code = NotFound desc = could not find container \"ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8\": container with ID starting with ba6689546e420622861c98ea4cd2e7023f02cc25403ad90b0fb61b62f2f25ae8 not found: ID does not exist" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.718809 4744 scope.go:117] "RemoveContainer" containerID="2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.720040 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d\": container with ID starting with 2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d not found: ID does not exist" containerID="2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720074 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d"} err="failed to get container status \"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d\": rpc error: code = NotFound desc = could not find container \"2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d\": container with ID starting with 2cd0c79b6cc1c0c63d5ceeb4bb0a03f02155f46079799783f7d3bf06d69d600d not found: ID does not exist" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720113 4744 scope.go:117] "RemoveContainer" 
containerID="b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.720334 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66\": container with ID starting with b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66 not found: ID does not exist" containerID="b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720357 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66"} err="failed to get container status \"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66\": rpc error: code = NotFound desc = could not find container \"b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66\": container with ID starting with b599959ed856ff31b18a3034f0429e63cbc931ade4366d9c691eaf7ae51e3d66 not found: ID does not exist" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720373 4744 scope.go:117] "RemoveContainer" containerID="1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.720581 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8\": container with ID starting with 1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8 not found: ID does not exist" containerID="1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720601 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8"} err="failed to get container status \"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8\": rpc error: code = NotFound desc = could not find container \"1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8\": container with ID starting with 1f08ed016ad9601e7551caa7e36f5e0a2bb6aad562b28fe994bcd75cc65506b8 not found: ID does not exist" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.720615 4744 scope.go:117] "RemoveContainer" containerID="2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.744511 4744 scope.go:117] "RemoveContainer" containerID="2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31" Dec 05 20:34:32 crc kubenswrapper[4744]: E1205 20:34:32.749591 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31\": container with ID starting with 2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31 not found: ID does not exist" containerID="2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.749665 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31"} err="failed to get container status \"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31\": rpc error: code = 
NotFound desc = could not find container \"2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31\": container with ID starting with 2fc2ecf6fbf16d6670d9de5e228ddb4b018ce807ec347dfaf61e96dfd4e5bf31 not found: ID does not exist" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814147 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vnx8\" (UniqueName: \"kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814322 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.814570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.916818 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.916871 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.916912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.916948 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.916985 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.917009 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.917095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vnx8\" (UniqueName: \"kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.917128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.917574 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.917649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.922418 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.925662 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.925736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.928320 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.934495 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vnx8\" (UniqueName: \"kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:32 crc kubenswrapper[4744]: I1205 20:34:32.935605 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data\") pod \"ceilometer-0\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.006425 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.009022 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.119116 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fk8p\" (UniqueName: \"kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p\") pod \"bc65c044-7e38-4173-9128-fe84d1866ae7\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.119420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts\") pod \"bc65c044-7e38-4173-9128-fe84d1866ae7\" (UID: \"bc65c044-7e38-4173-9128-fe84d1866ae7\") " Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.120583 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc65c044-7e38-4173-9128-fe84d1866ae7" (UID: "bc65c044-7e38-4173-9128-fe84d1866ae7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.125659 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p" (OuterVolumeSpecName: "kube-api-access-7fk8p") pod "bc65c044-7e38-4173-9128-fe84d1866ae7" (UID: "bc65c044-7e38-4173-9128-fe84d1866ae7"). InnerVolumeSpecName "kube-api-access-7fk8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.222086 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fk8p\" (UniqueName: \"kubernetes.io/projected/bc65c044-7e38-4173-9128-fe84d1866ae7-kube-api-access-7fk8p\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.222127 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc65c044-7e38-4173-9128-fe84d1866ae7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.463377 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.564650 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.565897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" event={"ID":"bc65c044-7e38-4173-9128-fe84d1866ae7","Type":"ContainerDied","Data":"df9134c27807409dd87816e09aac0bc27def71221adfe6475b2aa791f542e531"} Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.565941 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df9134c27807409dd87816e09aac0bc27def71221adfe6475b2aa791f542e531" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.565992 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher43c1-account-delete-66tz9" Dec 05 20:34:33 crc kubenswrapper[4744]: I1205 20:34:33.574922 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerStarted","Data":"e9de6b958aa497a6e95dd5372c4da154b34159707fe8e261ed6cb2a8ebaf3860"} Dec 05 20:34:34 crc kubenswrapper[4744]: I1205 20:34:34.093200 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770c8bdc-9e6e-450b-a6c9-d579cada809b" path="/var/lib/kubelet/pods/770c8bdc-9e6e-450b-a6c9-d579cada809b/volumes" Dec 05 20:34:34 crc kubenswrapper[4744]: I1205 20:34:34.094246 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99c95d4-70dd-484d-9fca-ca76589af16e" path="/var/lib/kubelet/pods/d99c95d4-70dd-484d-9fca-ca76589af16e/volumes" Dec 05 20:34:34 crc kubenswrapper[4744]: E1205 20:34:34.101545 4744 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/ca723c16538e96e7280a1c7361f7e632187828784eaf84c7139f556948c3ef57/diff" to get inode usage: stat /var/lib/containers/storage/overlay/ca723c16538e96e7280a1c7361f7e632187828784eaf84c7139f556948c3ef57/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/watcher-kuttl-default_ceilometer-0_770c8bdc-9e6e-450b-a6c9-d579cada809b/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/watcher-kuttl-default_ceilometer-0_770c8bdc-9e6e-450b-a6c9-d579cada809b/ceilometer-notification-agent/0.log: no such file or directory Dec 05 20:34:34 crc kubenswrapper[4744]: I1205 20:34:34.593430 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerStarted","Data":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} Dec 05 20:34:34 crc kubenswrapper[4744]: I1205 20:34:34.999076 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5wppb"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.008869 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5wppb"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.022752 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.030460 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-43c1-account-create-update-kvtmw"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.038346 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher43c1-account-delete-66tz9"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.044120 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher43c1-account-delete-66tz9"] Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.648448 4744 generic.go:334] "Generic (PLEG): container finished" podID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerID="89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" exitCode=0 Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.648631 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4","Type":"ContainerDied","Data":"89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf"} Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.656453 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerStarted","Data":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.818167 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.964198 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data\") pod \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.964267 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfjp\" (UniqueName: \"kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp\") pod \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.964337 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle\") pod \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.964386 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs\") pod \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\" (UID: \"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4\") " Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.965177 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs" (OuterVolumeSpecName: "logs") pod "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" (UID: "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:35 crc kubenswrapper[4744]: I1205 20:34:35.969469 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp" (OuterVolumeSpecName: "kube-api-access-wlfjp") pod "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" (UID: "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4"). InnerVolumeSpecName "kube-api-access-wlfjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.011063 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data" (OuterVolumeSpecName: "config-data") pod "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" (UID: "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.011361 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" (UID: "33dfed55-19a0-4b9e-b75e-f7ba3d14efb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.066031 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.066070 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.066089 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfjp\" (UniqueName: \"kubernetes.io/projected/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-kube-api-access-wlfjp\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.066105 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.090934 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7c1e46-ec30-458e-8402-8fea6d2ed3b1" path="/var/lib/kubelet/pods/0e7c1e46-ec30-458e-8402-8fea6d2ed3b1/volumes" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.091512 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3631e57-e315-4db0-b144-49e90d81ce43" path="/var/lib/kubelet/pods/b3631e57-e315-4db0-b144-49e90d81ce43/volumes" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.092044 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc65c044-7e38-4173-9128-fe84d1866ae7" path="/var/lib/kubelet/pods/bc65c044-7e38-4173-9128-fe84d1866ae7/volumes" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.669450 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerStarted","Data":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.672240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"33dfed55-19a0-4b9e-b75e-f7ba3d14efb4","Type":"ContainerDied","Data":"00675d49bc6918b1944adb94eaf8e1d4e7431f6d887f5afbb15fc8619937c23d"} Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.672322 4744 scope.go:117] "RemoveContainer" containerID="89196f127391c2d73c3ba9af37fe90466ac24afbda421da1d9959118c97d9bbf" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.672581 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.707096 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:36 crc kubenswrapper[4744]: I1205 20:34:36.715057 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.682490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerStarted","Data":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.682819 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-central-agent" containerID="cri-o://4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" gracePeriod=30 Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.682876 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.682936 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="proxy-httpd" containerID="cri-o://119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" gracePeriod=30 Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.682987 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="sg-core" containerID="cri-o://c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" gracePeriod=30 Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.683022 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-notification-agent" containerID="cri-o://2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" gracePeriod=30 Dec 05 20:34:37 crc kubenswrapper[4744]: I1205 20:34:37.713869 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.604208677 podStartE2EDuration="5.713853073s" podCreationTimestamp="2025-12-05 20:34:32 +0000 UTC" firstStartedPulling="2025-12-05 20:34:33.471279089 +0000 UTC m=+1443.701090457" lastFinishedPulling="2025-12-05 20:34:36.580923485 +0000 UTC m=+1446.810734853" observedRunningTime="2025-12-05 20:34:37.707618529 +0000 UTC m=+1447.937429907" watchObservedRunningTime="2025-12-05 20:34:37.713853073 +0000 UTC m=+1447.943664431" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.107918 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" path="/var/lib/kubelet/pods/33dfed55-19a0-4b9e-b75e-f7ba3d14efb4/volumes" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.114002 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ht55"] Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.114633 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc65c044-7e38-4173-9128-fe84d1866ae7" containerName="mariadb-account-delete" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.114751 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc65c044-7e38-4173-9128-fe84d1866ae7" containerName="mariadb-account-delete" Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.114831 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerName="watcher-applier" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.114896 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerName="watcher-applier" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.115178 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc65c044-7e38-4173-9128-fe84d1866ae7" containerName="mariadb-account-delete" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.115276 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33dfed55-19a0-4b9e-b75e-f7ba3d14efb4" containerName="watcher-applier" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.116076 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.123999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ht55"] Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.211629 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ljz\" (UniqueName: \"kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.211721 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.223619 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-489c-account-create-update-hh5w9"] Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.224953 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.227535 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.232567 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-489c-account-create-update-hh5w9"] Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.313613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ljz\" (UniqueName: \"kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.313691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.314659 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.339211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ljz\" (UniqueName: \"kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz\") pod \"watcher-db-create-5ht55\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.422350 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65lx2\" (UniqueName: \"kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.422444 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.523741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65lx2\" (UniqueName: \"kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.523856 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.524748 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.529282 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.542546 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65lx2\" (UniqueName: \"kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2\") pod \"watcher-489c-account-create-update-hh5w9\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.550784 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.561651 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700843 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1449ab6-9c86-4811-93c8-04f45c613128" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" exitCode=0 Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700877 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1449ab6-9c86-4811-93c8-04f45c613128" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" exitCode=2 Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700886 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1449ab6-9c86-4811-93c8-04f45c613128" containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" exitCode=0 Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700896 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1449ab6-9c86-4811-93c8-04f45c613128" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" exitCode=0 Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerDied","Data":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700949 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerDied","Data":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700962 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerDied","Data":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerDied","Data":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.700985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1449ab6-9c86-4811-93c8-04f45c613128","Type":"ContainerDied","Data":"e9de6b958aa497a6e95dd5372c4da154b34159707fe8e261ed6cb2a8ebaf3860"} Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.701003 4744 scope.go:117] "RemoveContainer" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.701144 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.727801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.727871 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.727901 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.727933 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.727958 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.728014 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.728131 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: 
\"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.728149 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vnx8\" (UniqueName: \"kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8\") pod \"b1449ab6-9c86-4811-93c8-04f45c613128\" (UID: \"b1449ab6-9c86-4811-93c8-04f45c613128\") " Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.729307 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.733081 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.733432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts" (OuterVolumeSpecName: "scripts") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.740166 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8" (OuterVolumeSpecName: "kube-api-access-2vnx8") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "kube-api-access-2vnx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.754355 4744 scope.go:117] "RemoveContainer" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.772871 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.785413 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.810818 4744 scope.go:117] "RemoveContainer" containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.827155 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830272 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830324 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vnx8\" (UniqueName: \"kubernetes.io/projected/b1449ab6-9c86-4811-93c8-04f45c613128-kube-api-access-2vnx8\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830340 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830351 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830361 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830371 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.830381 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1449ab6-9c86-4811-93c8-04f45c613128-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.860486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data" (OuterVolumeSpecName: "config-data") pod "b1449ab6-9c86-4811-93c8-04f45c613128" (UID: "b1449ab6-9c86-4811-93c8-04f45c613128"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.917914 4744 scope.go:117] "RemoveContainer" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.932204 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1449ab6-9c86-4811-93c8-04f45c613128-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.948157 4744 scope.go:117] "RemoveContainer" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.948868 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": container with ID starting with 119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984 not found: ID does not exist" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.948894 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} err="failed to get container status \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": rpc error: code = NotFound desc = could not find container \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": container with ID starting with 119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984 not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.948923 4744 scope.go:117] "RemoveContainer" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.949163 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": container with ID starting with c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a not found: ID does not exist" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949197 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} err="failed to get container status \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": rpc error: code = NotFound desc = could not find container \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": container with ID starting with c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949224 4744 scope.go:117] "RemoveContainer" containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.949682 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": container with ID starting with 2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd not found: ID does not exist" 
containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949705 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} err="failed to get container status \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": rpc error: code = NotFound desc = could not find container \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": container with ID starting with 2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949719 4744 scope.go:117] "RemoveContainer" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: E1205 20:34:38.949883 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": container with ID starting with 4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d not found: ID does not exist" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949904 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} err="failed to get container status \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": rpc error: code = NotFound desc = could not find container \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": container with ID starting with 4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.949917 4744 scope.go:117] "RemoveContainer" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950070 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} err="failed to get container status \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": rpc error: code = NotFound desc = could not find container \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": container with ID starting with 119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984 not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950088 4744 scope.go:117] "RemoveContainer" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950246 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} err="failed to get container status \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": rpc error: code = NotFound desc = could not find container \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": container with ID starting with c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950262 4744 scope.go:117] "RemoveContainer" 
containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950549 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} err="failed to get container status \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": rpc error: code = NotFound desc = could not find container \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": container with ID starting with 2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950564 4744 scope.go:117] "RemoveContainer" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950735 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} err="failed to get container status \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": rpc error: code = NotFound desc = could not find container \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": container with ID starting with 4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950751 4744 scope.go:117] "RemoveContainer" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950942 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} err="failed to get container status \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": rpc error: code = NotFound desc = could not find container \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": container with ID starting with 119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984 not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.950960 4744 scope.go:117] "RemoveContainer" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951111 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} err="failed to get container status \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": rpc error: code = NotFound desc = could not find container \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": container with ID starting with c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951123 4744 scope.go:117] "RemoveContainer" containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951314 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} err="failed to get container status \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": rpc error: code = NotFound desc = could not find 
container \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": container with ID starting with 2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951330 4744 scope.go:117] "RemoveContainer" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951576 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} err="failed to get container status \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": rpc error: code = NotFound desc = could not find container \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": container with ID starting with 4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951589 4744 scope.go:117] "RemoveContainer" containerID="119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951898 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984"} err="failed to get container status \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": rpc error: code = NotFound desc = could not find container \"119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984\": container with ID starting with 119870b75450762f85162dfb41a295ad60e5f25dc26768f08c8bc81d126a1984 not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.951912 4744 scope.go:117] "RemoveContainer" containerID="c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.952125 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a"} err="failed to get container status \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": rpc error: code = NotFound desc = could not find container \"c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a\": container with ID starting with c5aa94bcf5799ebe49734241644650d1e2081379d8b73da6faca181eb00f0a3a not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.952136 4744 scope.go:117] "RemoveContainer" containerID="2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.952324 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd"} err="failed to get container status \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": rpc error: code = NotFound desc = could not find container \"2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd\": container with ID starting with 2405b80d4908075f53d3dd6482253bc63874671a0cbd2316f33a8824590690bd not found: ID does not exist" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.952337 4744 scope.go:117] "RemoveContainer" containerID="4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d" Dec 05 20:34:38 crc kubenswrapper[4744]: I1205 20:34:38.952482 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d"} err="failed to get container status \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": rpc error: code = NotFound desc = could not find container \"4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d\": container with ID starting with 4628ae0230011a421843fb62c4a66de4f01bdc666cf0140ab5df80dce6560a1d not found: ID does not exist" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.041995 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.053714 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.063450 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:39 crc kubenswrapper[4744]: E1205 20:34:39.063824 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="proxy-httpd" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.063842 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="proxy-httpd" Dec 05 20:34:39 crc kubenswrapper[4744]: E1205 20:34:39.063854 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-central-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.063861 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-central-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: E1205 20:34:39.063880 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="sg-core" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.063888 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="sg-core" Dec 05 20:34:39 crc kubenswrapper[4744]: E1205 20:34:39.063900 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-notification-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.063906 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-notification-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.064190 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-central-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.064209 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="proxy-httpd" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.064236 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="sg-core" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.064271 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" containerName="ceilometer-notification-agent" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.067627 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.075421 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ht55"] Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.075889 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.075947 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.076015 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.111537 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.147868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.147930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.147983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.148101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.148154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.148185 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.148231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.148308 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk5zg\" (UniqueName: \"kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.163624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-489c-account-create-update-hh5w9"] Dec 05 20:34:39 crc kubenswrapper[4744]: W1205 20:34:39.164055 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783b1f69_ef7e_4a6e_8bc6_27efb86e6fca.slice/crio-9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370 WatchSource:0}: Error finding container 9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370: Status 404 returned error can't find the container with id 9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370 Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252314 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk5zg\" (UniqueName: \"kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.252960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.253220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.256881 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.258022 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.260322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.261672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.262412 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.279323 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk5zg\" (UniqueName: \"kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg\") pod \"ceilometer-0\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.415100 
4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.718241 4744 generic.go:334] "Generic (PLEG): container finished" podID="bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" containerID="7bd559647e30caa57c089caf9a11afcfb54f1242416956d673e2b1b849c775b1" exitCode=0 Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.718303 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ht55" event={"ID":"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e","Type":"ContainerDied","Data":"7bd559647e30caa57c089caf9a11afcfb54f1242416956d673e2b1b849c775b1"} Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.718578 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ht55" event={"ID":"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e","Type":"ContainerStarted","Data":"3489a3d8290c2a804f4114c8bda152fdf5b024281046a46e5b9051b789ce9416"} Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.721955 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" event={"ID":"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca","Type":"ContainerStarted","Data":"d0875b5d39e0e847f374a2df32cb7bd08b1d2e9efb4d5afac7c9c80a330ccb83"} Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.721997 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" event={"ID":"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca","Type":"ContainerStarted","Data":"9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370"} Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.754867 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" podStartSLOduration=1.754848199 podStartE2EDuration="1.754848199s" podCreationTimestamp="2025-12-05 20:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:39.753816514 +0000 UTC m=+1449.983627892" watchObservedRunningTime="2025-12-05 20:34:39.754848199 +0000 UTC m=+1449.984659567" Dec 05 20:34:39 crc kubenswrapper[4744]: I1205 20:34:39.901059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:34:39 crc kubenswrapper[4744]: W1205 20:34:39.901285 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3efc84d7_14e5_4b45_9913_5be849b305ee.slice/crio-519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828 WatchSource:0}: Error finding container 519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828: Status 404 returned error can't find the container with id 519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828 Dec 05 20:34:40 crc kubenswrapper[4744]: I1205 20:34:40.093359 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1449ab6-9c86-4811-93c8-04f45c613128" path="/var/lib/kubelet/pods/b1449ab6-9c86-4811-93c8-04f45c613128/volumes" Dec 05 20:34:40 crc kubenswrapper[4744]: I1205 20:34:40.730730 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerStarted","Data":"c68a8721639ff6780a5e47c3baff7e7748c1e124c1430194d0fd371c4946756a"} Dec 05 
20:34:40 crc kubenswrapper[4744]: I1205 20:34:40.731050 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerStarted","Data":"519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828"} Dec 05 20:34:40 crc kubenswrapper[4744]: I1205 20:34:40.733013 4744 generic.go:334] "Generic (PLEG): container finished" podID="783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" containerID="d0875b5d39e0e847f374a2df32cb7bd08b1d2e9efb4d5afac7c9c80a330ccb83" exitCode=0 Dec 05 20:34:40 crc kubenswrapper[4744]: I1205 20:34:40.733101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" event={"ID":"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca","Type":"ContainerDied","Data":"d0875b5d39e0e847f374a2df32cb7bd08b1d2e9efb4d5afac7c9c80a330ccb83"} Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.104871 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.285680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts\") pod \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.285767 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2ljz\" (UniqueName: \"kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz\") pod \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\" (UID: \"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e\") " Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.286702 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" (UID: "bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.299160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz" (OuterVolumeSpecName: "kube-api-access-m2ljz") pod "bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" (UID: "bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e"). InnerVolumeSpecName "kube-api-access-m2ljz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.388001 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.388033 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2ljz\" (UniqueName: \"kubernetes.io/projected/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e-kube-api-access-m2ljz\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.744370 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ht55" event={"ID":"bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e","Type":"ContainerDied","Data":"3489a3d8290c2a804f4114c8bda152fdf5b024281046a46e5b9051b789ce9416"} Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.744683 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3489a3d8290c2a804f4114c8bda152fdf5b024281046a46e5b9051b789ce9416" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.744757 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ht55" Dec 05 20:34:41 crc kubenswrapper[4744]: I1205 20:34:41.749887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerStarted","Data":"89c0f0f6d1336044c620f54079aa4b81b5eaa806807f8ceb58c77418c5afa7d5"} Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.194894 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.303957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts\") pod \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.304404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" (UID: "783b1f69-ef7e-4a6e-8bc6-27efb86e6fca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.304417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65lx2\" (UniqueName: \"kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2\") pod \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\" (UID: \"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca\") " Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.305020 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.319563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2" (OuterVolumeSpecName: "kube-api-access-65lx2") pod "783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" (UID: "783b1f69-ef7e-4a6e-8bc6-27efb86e6fca"). InnerVolumeSpecName "kube-api-access-65lx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.407011 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65lx2\" (UniqueName: \"kubernetes.io/projected/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca-kube-api-access-65lx2\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.769006 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerStarted","Data":"9f51ded012562761ae7cc1414729222e1030c498d1ac491ef6f97d94f83e559c"} Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.772085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" event={"ID":"783b1f69-ef7e-4a6e-8bc6-27efb86e6fca","Type":"ContainerDied","Data":"9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370"} Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.772143 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a43ab4e59ae0052c25e2d5d88646030c27c30ff7f9f6c586f8496ef4ab72370" Dec 05 20:34:42 crc kubenswrapper[4744]: I1205 20:34:42.772178 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-489c-account-create-update-hh5w9" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.506778 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-shh58"] Dec 05 20:34:43 crc kubenswrapper[4744]: E1205 20:34:43.507382 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" containerName="mariadb-database-create" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.507398 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" containerName="mariadb-database-create" Dec 05 20:34:43 crc kubenswrapper[4744]: E1205 20:34:43.507420 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" containerName="mariadb-account-create-update" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.507427 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" containerName="mariadb-account-create-update" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.507569 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" containerName="mariadb-account-create-update" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.507588 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" containerName="mariadb-database-create" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.508097 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.510219 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-bnd8x" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.511551 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.525300 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-shh58"] Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.626539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.627098 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.627340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc 
kubenswrapper[4744]: I1205 20:34:43.627407 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmccd\" (UniqueName: \"kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.728805 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.728869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.728905 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.728968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmccd\" (UniqueName: \"kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.735599 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.736050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.743869 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.750263 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmccd\" (UniqueName: \"kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd\") pod \"watcher-kuttl-db-sync-shh58\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.784778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerStarted","Data":"907cd745d35bc32e4192938fd9d110d9799cac27617b634baafcb9e67964cddf"} Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.786798 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.830187 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.885701075 podStartE2EDuration="4.830161267s" podCreationTimestamp="2025-12-05 20:34:39 +0000 UTC" firstStartedPulling="2025-12-05 20:34:39.903387405 +0000 UTC m=+1450.133198773" lastFinishedPulling="2025-12-05 20:34:42.847847597 +0000 UTC m=+1453.077658965" observedRunningTime="2025-12-05 20:34:43.821451493 +0000 UTC m=+1454.051262891" watchObservedRunningTime="2025-12-05 20:34:43.830161267 +0000 UTC m=+1454.059972675" Dec 05 20:34:43 crc kubenswrapper[4744]: I1205 20:34:43.832694 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:44 crc kubenswrapper[4744]: I1205 20:34:44.285739 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-shh58"] Dec 05 20:34:44 crc kubenswrapper[4744]: W1205 20:34:44.290277 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64a1d3a4_e237_4ac2_92fa_3521d264d0f0.slice/crio-9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e WatchSource:0}: Error finding container 9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e: Status 404 returned error can't find the container with id 9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e Dec 05 20:34:44 crc kubenswrapper[4744]: I1205 20:34:44.795117 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" event={"ID":"64a1d3a4-e237-4ac2-92fa-3521d264d0f0","Type":"ContainerStarted","Data":"c56c20d58f662ebca38e439b26d62ea3ef945a21c219b1b59b56db3ec172b337"} Dec 05 20:34:44 crc kubenswrapper[4744]: I1205 20:34:44.796327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" event={"ID":"64a1d3a4-e237-4ac2-92fa-3521d264d0f0","Type":"ContainerStarted","Data":"9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e"} Dec 05 20:34:44 crc kubenswrapper[4744]: I1205 20:34:44.813627 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" podStartSLOduration=1.8136129859999999 podStartE2EDuration="1.813612986s" podCreationTimestamp="2025-12-05 20:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:44.809818383 +0000 UTC m=+1455.039629741" watchObservedRunningTime="2025-12-05 20:34:44.813612986 +0000 UTC m=+1455.043424354" Dec 05 20:34:47 crc kubenswrapper[4744]: I1205 20:34:47.827952 4744 generic.go:334] "Generic (PLEG): container finished" podID="64a1d3a4-e237-4ac2-92fa-3521d264d0f0" 
containerID="c56c20d58f662ebca38e439b26d62ea3ef945a21c219b1b59b56db3ec172b337" exitCode=0 Dec 05 20:34:47 crc kubenswrapper[4744]: I1205 20:34:47.828027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" event={"ID":"64a1d3a4-e237-4ac2-92fa-3521d264d0f0","Type":"ContainerDied","Data":"c56c20d58f662ebca38e439b26d62ea3ef945a21c219b1b59b56db3ec172b337"} Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.223418 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.316713 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle\") pod \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.317089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmccd\" (UniqueName: \"kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd\") pod \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.317154 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data\") pod \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.317182 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data\") pod \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\" (UID: \"64a1d3a4-e237-4ac2-92fa-3521d264d0f0\") " Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.322438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "64a1d3a4-e237-4ac2-92fa-3521d264d0f0" (UID: "64a1d3a4-e237-4ac2-92fa-3521d264d0f0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.322559 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd" (OuterVolumeSpecName: "kube-api-access-wmccd") pod "64a1d3a4-e237-4ac2-92fa-3521d264d0f0" (UID: "64a1d3a4-e237-4ac2-92fa-3521d264d0f0"). InnerVolumeSpecName "kube-api-access-wmccd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.340057 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64a1d3a4-e237-4ac2-92fa-3521d264d0f0" (UID: "64a1d3a4-e237-4ac2-92fa-3521d264d0f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.371197 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data" (OuterVolumeSpecName: "config-data") pod "64a1d3a4-e237-4ac2-92fa-3521d264d0f0" (UID: "64a1d3a4-e237-4ac2-92fa-3521d264d0f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.419325 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.419357 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.419366 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.419374 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmccd\" (UniqueName: \"kubernetes.io/projected/64a1d3a4-e237-4ac2-92fa-3521d264d0f0-kube-api-access-wmccd\") on node \"crc\" DevicePath \"\"" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.807279 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.807400 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.851153 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" event={"ID":"64a1d3a4-e237-4ac2-92fa-3521d264d0f0","Type":"ContainerDied","Data":"9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e"} Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.851197 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e9e42f21c95b6f511fde5f43c3d55404cd71a91dc15275f6cfe219cf005e20e" Dec 05 20:34:49 crc kubenswrapper[4744]: I1205 20:34:49.851256 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-shh58" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.115153 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: E1205 20:34:50.115567 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a1d3a4-e237-4ac2-92fa-3521d264d0f0" containerName="watcher-kuttl-db-sync" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.115591 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a1d3a4-e237-4ac2-92fa-3521d264d0f0" containerName="watcher-kuttl-db-sync" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.115829 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a1d3a4-e237-4ac2-92fa-3521d264d0f0" containerName="watcher-kuttl-db-sync" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.116916 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.119555 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.120133 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-bnd8x" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.124709 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.125721 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.127111 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.129915 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66ht\" (UniqueName: \"kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.129976 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130052 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhgv\" (UniqueName: \"kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") 
" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130124 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130155 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130220 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130269 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.130354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.139004 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.165345 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.202428 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.223256 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.227783 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231197 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhgv\" (UniqueName: \"kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231338 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231388 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231516 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231536 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66ht\" (UniqueName: \"kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231693 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231719 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.231844 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tdw\" (UniqueName: \"kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.234817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.236995 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.237870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.238496 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.238668 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.238686 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.243282 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.246325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.251980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66ht\" (UniqueName: \"kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht\") pod \"watcher-kuttl-api-0\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.252081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhgv\" (UniqueName: \"kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv\") pod \"watcher-kuttl-applier-0\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.332868 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tdw\" (UniqueName: \"kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.333193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.333232 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.333261 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.333314 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.333810 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.336904 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.337925 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.338304 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.346818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tdw\" (UniqueName: \"kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.432191 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.459237 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.606912 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.922347 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:34:50 crc kubenswrapper[4744]: I1205 20:34:50.968704 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.196278 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:34:51 crc kubenswrapper[4744]: W1205 20:34:51.201426 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc5f0e4a_62fa_4b20_bb8f_4de79b50dc36.slice/crio-735b6933d79895c4701a7a64ee846ce68039828128de8e94e116d0bd41669fee WatchSource:0}: Error finding container 735b6933d79895c4701a7a64ee846ce68039828128de8e94e116d0bd41669fee: Status 404 returned error can't find the container with id 735b6933d79895c4701a7a64ee846ce68039828128de8e94e116d0bd41669fee Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.886743 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerStarted","Data":"ca1b5f30478bd337aeaa00fb47fc8b774f1c74d136d2283d8e7ca9d1135f54ff"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.887061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerStarted","Data":"81f1367d96f1bf07e013614a1bde2794dd5eef9ac20190f1f05118521668789b"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.889314 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36","Type":"ContainerStarted","Data":"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.889345 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36","Type":"ContainerStarted","Data":"735b6933d79895c4701a7a64ee846ce68039828128de8e94e116d0bd41669fee"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.890865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"fe602728-cd71-4a0d-8a05-08c295621691","Type":"ContainerStarted","Data":"7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.890888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"fe602728-cd71-4a0d-8a05-08c295621691","Type":"ContainerStarted","Data":"deaa3d25d8c22a8df2a1bb5a1bd0ed5e6ba1718e73f47662d49dfa1a6af3929d"} Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.913269 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.913249416 podStartE2EDuration="1.913249416s" podCreationTimestamp="2025-12-05 20:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:51.905507076 
+0000 UTC m=+1462.135318444" watchObservedRunningTime="2025-12-05 20:34:51.913249416 +0000 UTC m=+1462.143060784" Dec 05 20:34:51 crc kubenswrapper[4744]: I1205 20:34:51.921180 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.92116278 podStartE2EDuration="1.92116278s" podCreationTimestamp="2025-12-05 20:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:51.920800802 +0000 UTC m=+1462.150612180" watchObservedRunningTime="2025-12-05 20:34:51.92116278 +0000 UTC m=+1462.150974148" Dec 05 20:34:52 crc kubenswrapper[4744]: I1205 20:34:52.902911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerStarted","Data":"459e3d5c0d306564a3666de2a1828256b44e906766ae6fe1639db84c71df8b1b"} Dec 05 20:34:52 crc kubenswrapper[4744]: I1205 20:34:52.903824 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:52 crc kubenswrapper[4744]: I1205 20:34:52.925803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.925785528 podStartE2EDuration="2.925785528s" podCreationTimestamp="2025-12-05 20:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:34:52.921404671 +0000 UTC m=+1463.151216049" watchObservedRunningTime="2025-12-05 20:34:52.925785528 +0000 UTC m=+1463.155596916" Dec 05 20:34:55 crc kubenswrapper[4744]: I1205 20:34:55.247487 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:55 crc kubenswrapper[4744]: I1205 20:34:55.432638 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:34:55 crc kubenswrapper[4744]: I1205 20:34:55.460177 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.433253 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.445968 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.459534 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.497201 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.607938 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.649307 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.914421 4744 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.918445 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.945981 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:00 crc kubenswrapper[4744]: I1205 20:35:00.946589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.778651 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-shh58"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.787704 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-shh58"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.803766 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher489c-account-delete-94nmg"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.804801 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.819762 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher489c-account-delete-94nmg"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.873848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4gzp\" (UniqueName: \"kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.874136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.876565 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.906168 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.928122 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="fe602728-cd71-4a0d-8a05-08c295621691" containerName="watcher-applier" containerID="cri-o://7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" gracePeriod=30 Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.928452 4744 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-bnd8x\" not found" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.975675 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4gzp\" (UniqueName: \"kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.975717 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: E1205 20:35:02.975839 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:02 crc kubenswrapper[4744]: E1205 20:35:02.975890 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data podName:cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:03.475872737 +0000 UTC m=+1473.705684105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.978253 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.982834 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.983032 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-kuttl-api-log" containerID="cri-o://ca1b5f30478bd337aeaa00fb47fc8b774f1c74d136d2283d8e7ca9d1135f54ff" gracePeriod=30 Dec 05 20:35:02 crc kubenswrapper[4744]: I1205 20:35:02.983368 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-api" containerID="cri-o://459e3d5c0d306564a3666de2a1828256b44e906766ae6fe1639db84c71df8b1b" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.003813 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4gzp\" (UniqueName: \"kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp\") pod \"watcher489c-account-delete-94nmg\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " 
pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.132692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.473064 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.473395 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-central-agent" containerID="cri-o://c68a8721639ff6780a5e47c3baff7e7748c1e124c1430194d0fd371c4946756a" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.474173 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="proxy-httpd" containerID="cri-o://907cd745d35bc32e4192938fd9d110d9799cac27617b634baafcb9e67964cddf" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.474233 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="sg-core" containerID="cri-o://9f51ded012562761ae7cc1414729222e1030c498d1ac491ef6f97d94f83e559c" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.474282 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-notification-agent" containerID="cri-o://89c0f0f6d1336044c620f54079aa4b81b5eaa806807f8ceb58c77418c5afa7d5" gracePeriod=30 Dec 05 20:35:03 crc kubenswrapper[4744]: E1205 20:35:03.484705 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:03 crc kubenswrapper[4744]: E1205 20:35:03.484783 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data podName:cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:04.484765428 +0000 UTC m=+1474.714576796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:03 crc kubenswrapper[4744]: W1205 20:35:03.557391 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6fbdeb6_0aa3_4191_b1a3_1f61d99e38a9.slice/crio-d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8 WatchSource:0}: Error finding container d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8: Status 404 returned error can't find the container with id d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.558437 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher489c-account-delete-94nmg"] Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.587449 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.140:3000/\": read tcp 10.217.0.2:58674->10.217.0.140:3000: read: connection reset by peer" Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.937831 4744 generic.go:334] "Generic (PLEG): container finished" podID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerID="9f51ded012562761ae7cc1414729222e1030c498d1ac491ef6f97d94f83e559c" exitCode=2 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.937904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerDied","Data":"9f51ded012562761ae7cc1414729222e1030c498d1ac491ef6f97d94f83e559c"} Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.940637 4744 generic.go:334] "Generic (PLEG): container finished" podID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerID="ca1b5f30478bd337aeaa00fb47fc8b774f1c74d136d2283d8e7ca9d1135f54ff" exitCode=143 Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.940705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerDied","Data":"ca1b5f30478bd337aeaa00fb47fc8b774f1c74d136d2283d8e7ca9d1135f54ff"} Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.941800 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" event={"ID":"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9","Type":"ContainerStarted","Data":"d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8"} Dec 05 20:35:03 crc kubenswrapper[4744]: I1205 20:35:03.941914 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" containerName="watcher-decision-engine" containerID="cri-o://cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167" gracePeriod=30 Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.090761 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a1d3a4-e237-4ac2-92fa-3521d264d0f0" path="/var/lib/kubelet/pods/64a1d3a4-e237-4ac2-92fa-3521d264d0f0/volumes" Dec 05 20:35:04 crc kubenswrapper[4744]: E1205 
20:35:04.500913 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:04 crc kubenswrapper[4744]: E1205 20:35:04.501040 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data podName:cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:06.501023842 +0000 UTC m=+1476.730835210 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.957190 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" containerID="0bf1083548f5d39ef55e8658a898e3970cc658bd2c5be4fb2dbaddf47052a6b7" exitCode=0 Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.957465 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" event={"ID":"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9","Type":"ContainerDied","Data":"0bf1083548f5d39ef55e8658a898e3970cc658bd2c5be4fb2dbaddf47052a6b7"} Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.960429 4744 generic.go:334] "Generic (PLEG): container finished" podID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerID="907cd745d35bc32e4192938fd9d110d9799cac27617b634baafcb9e67964cddf" exitCode=0 Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.960447 4744 generic.go:334] "Generic (PLEG): container finished" podID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerID="c68a8721639ff6780a5e47c3baff7e7748c1e124c1430194d0fd371c4946756a" exitCode=0 Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.960475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerDied","Data":"907cd745d35bc32e4192938fd9d110d9799cac27617b634baafcb9e67964cddf"} Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.960494 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerDied","Data":"c68a8721639ff6780a5e47c3baff7e7748c1e124c1430194d0fd371c4946756a"} Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.962012 4744 generic.go:334] "Generic (PLEG): container finished" podID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerID="459e3d5c0d306564a3666de2a1828256b44e906766ae6fe1639db84c71df8b1b" exitCode=0 Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.962041 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerDied","Data":"459e3d5c0d306564a3666de2a1828256b44e906766ae6fe1639db84c71df8b1b"} Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.962061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"8397b5a7-4151-4c3b-a1c9-199df35771d9","Type":"ContainerDied","Data":"81f1367d96f1bf07e013614a1bde2794dd5eef9ac20190f1f05118521668789b"} Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.962073 4744 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="81f1367d96f1bf07e013614a1bde2794dd5eef9ac20190f1f05118521668789b" Dec 05 20:35:04 crc kubenswrapper[4744]: I1205 20:35:04.992231 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.110589 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle\") pod \"8397b5a7-4151-4c3b-a1c9-199df35771d9\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.110634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66ht\" (UniqueName: \"kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht\") pod \"8397b5a7-4151-4c3b-a1c9-199df35771d9\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.110669 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca\") pod \"8397b5a7-4151-4c3b-a1c9-199df35771d9\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.110848 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data\") pod \"8397b5a7-4151-4c3b-a1c9-199df35771d9\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.110881 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs\") pod \"8397b5a7-4151-4c3b-a1c9-199df35771d9\" (UID: \"8397b5a7-4151-4c3b-a1c9-199df35771d9\") " Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.111339 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs" (OuterVolumeSpecName: "logs") pod "8397b5a7-4151-4c3b-a1c9-199df35771d9" (UID: "8397b5a7-4151-4c3b-a1c9-199df35771d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.118654 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht" (OuterVolumeSpecName: "kube-api-access-l66ht") pod "8397b5a7-4151-4c3b-a1c9-199df35771d9" (UID: "8397b5a7-4151-4c3b-a1c9-199df35771d9"). InnerVolumeSpecName "kube-api-access-l66ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.138815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8397b5a7-4151-4c3b-a1c9-199df35771d9" (UID: "8397b5a7-4151-4c3b-a1c9-199df35771d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.139260 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8397b5a7-4151-4c3b-a1c9-199df35771d9" (UID: "8397b5a7-4151-4c3b-a1c9-199df35771d9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.158409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data" (OuterVolumeSpecName: "config-data") pod "8397b5a7-4151-4c3b-a1c9-199df35771d9" (UID: "8397b5a7-4151-4c3b-a1c9-199df35771d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.213239 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.213280 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l66ht\" (UniqueName: \"kubernetes.io/projected/8397b5a7-4151-4c3b-a1c9-199df35771d9-kube-api-access-l66ht\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.213313 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.213327 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8397b5a7-4151-4c3b-a1c9-199df35771d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.213337 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397b5a7-4151-4c3b-a1c9-199df35771d9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:05 crc kubenswrapper[4744]: E1205 20:35:05.463010 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:05 crc kubenswrapper[4744]: E1205 20:35:05.464267 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:05 crc kubenswrapper[4744]: E1205 20:35:05.465642 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:05 crc kubenswrapper[4744]: E1205 20:35:05.465675 4744 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="fe602728-cd71-4a0d-8a05-08c295621691" containerName="watcher-applier" Dec 05 20:35:05 crc kubenswrapper[4744]: I1205 20:35:05.979141 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.028021 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.047553 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.089272 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" path="/var/lib/kubelet/pods/8397b5a7-4151-4c3b-a1c9-199df35771d9/volumes" Dec 05 20:35:06 crc kubenswrapper[4744]: E1205 20:35:06.171497 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397b5a7_4151_4c3b_a1c9_199df35771d9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397b5a7_4151_4c3b_a1c9_199df35771d9.slice/crio-81f1367d96f1bf07e013614a1bde2794dd5eef9ac20190f1f05118521668789b\": RecentStats: unable to find data in memory cache]" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.362627 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.432558 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts\") pod \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.432716 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4gzp\" (UniqueName: \"kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp\") pod \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\" (UID: \"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9\") " Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.433705 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" (UID: "a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.442036 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp" (OuterVolumeSpecName: "kube-api-access-b4gzp") pod "a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" (UID: "a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9"). InnerVolumeSpecName "kube-api-access-b4gzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.535637 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4gzp\" (UniqueName: \"kubernetes.io/projected/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-kube-api-access-b4gzp\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.535682 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:06 crc kubenswrapper[4744]: E1205 20:35:06.535723 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:06 crc kubenswrapper[4744]: E1205 20:35:06.535785 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data podName:cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:10.535769054 +0000 UTC m=+1480.765580422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.995816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" event={"ID":"a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9","Type":"ContainerDied","Data":"d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8"} Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.996237 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8dcad3c9235c83c6302f4d9e680e0af9e1994e678b8152ef8c64c8d33019bd8" Dec 05 20:35:06 crc kubenswrapper[4744]: I1205 20:35:06.996256 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher489c-account-delete-94nmg" Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.835160 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ht55"] Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.843347 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ht55"] Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.870247 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-489c-account-create-update-hh5w9"] Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.877703 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher489c-account-delete-94nmg"] Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.886762 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-489c-account-create-update-hh5w9"] Dec 05 20:35:07 crc kubenswrapper[4744]: I1205 20:35:07.900253 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher489c-account-delete-94nmg"] Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.013010 4744 generic.go:334] "Generic (PLEG): container finished" podID="fe602728-cd71-4a0d-8a05-08c295621691" containerID="7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" exitCode=0 Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.013103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"fe602728-cd71-4a0d-8a05-08c295621691","Type":"ContainerDied","Data":"7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4"} Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.020028 4744 generic.go:334] "Generic (PLEG): container finished" podID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerID="89c0f0f6d1336044c620f54079aa4b81b5eaa806807f8ceb58c77418c5afa7d5" exitCode=0 Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.020084 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerDied","Data":"89c0f0f6d1336044c620f54079aa4b81b5eaa806807f8ceb58c77418c5afa7d5"} Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.020146 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3efc84d7-14e5-4b45-9913-5be849b305ee","Type":"ContainerDied","Data":"519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828"} Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.020160 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="519a6f3dd788295135979850a4bcef8a87e03e268187622162b0d11f123ca828" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.030661 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.063970 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064064 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064127 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064166 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk5zg\" (UniqueName: \"kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064243 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064272 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.064305 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts\") pod \"3efc84d7-14e5-4b45-9913-5be849b305ee\" (UID: \"3efc84d7-14e5-4b45-9913-5be849b305ee\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.065730 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.066171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.084383 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg" (OuterVolumeSpecName: "kube-api-access-bk5zg") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "kube-api-access-bk5zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.085557 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts" (OuterVolumeSpecName: "scripts") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.091966 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783b1f69-ef7e-4a6e-8bc6-27efb86e6fca" path="/var/lib/kubelet/pods/783b1f69-ef7e-4a6e-8bc6-27efb86e6fca/volumes" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.092781 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" path="/var/lib/kubelet/pods/a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9/volumes" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.093223 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e" path="/var/lib/kubelet/pods/bd3f214e-fc00-43a7-b3ce-f266c8c8fb1e/volumes" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.094436 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.121657 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.140347 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166694 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166724 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk5zg\" (UniqueName: \"kubernetes.io/projected/3efc84d7-14e5-4b45-9913-5be849b305ee-kube-api-access-bk5zg\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166737 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166750 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166761 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3efc84d7-14e5-4b45-9913-5be849b305ee-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166772 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.166782 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.167493 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.196808 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data" (OuterVolumeSpecName: "config-data") pod "3efc84d7-14e5-4b45-9913-5be849b305ee" (UID: "3efc84d7-14e5-4b45-9913-5be849b305ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267310 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs\") pod \"fe602728-cd71-4a0d-8a05-08c295621691\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle\") pod \"fe602728-cd71-4a0d-8a05-08c295621691\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267511 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data\") pod \"fe602728-cd71-4a0d-8a05-08c295621691\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267620 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs" (OuterVolumeSpecName: "logs") pod "fe602728-cd71-4a0d-8a05-08c295621691" (UID: "fe602728-cd71-4a0d-8a05-08c295621691"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjhgv\" (UniqueName: \"kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv\") pod \"fe602728-cd71-4a0d-8a05-08c295621691\" (UID: \"fe602728-cd71-4a0d-8a05-08c295621691\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267974 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe602728-cd71-4a0d-8a05-08c295621691-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.267992 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3efc84d7-14e5-4b45-9913-5be849b305ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.270498 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv" (OuterVolumeSpecName: "kube-api-access-zjhgv") pod "fe602728-cd71-4a0d-8a05-08c295621691" (UID: "fe602728-cd71-4a0d-8a05-08c295621691"). InnerVolumeSpecName "kube-api-access-zjhgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.290844 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe602728-cd71-4a0d-8a05-08c295621691" (UID: "fe602728-cd71-4a0d-8a05-08c295621691"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.318163 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data" (OuterVolumeSpecName: "config-data") pod "fe602728-cd71-4a0d-8a05-08c295621691" (UID: "fe602728-cd71-4a0d-8a05-08c295621691"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.369835 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.369889 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe602728-cd71-4a0d-8a05-08c295621691-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.369904 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjhgv\" (UniqueName: \"kubernetes.io/projected/fe602728-cd71-4a0d-8a05-08c295621691-kube-api-access-zjhgv\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.886461 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.983959 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle\") pod \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.984026 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs\") pod \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.984057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data\") pod \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.984123 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca\") pod \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.984195 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7tdw\" (UniqueName: \"kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw\") pod \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\" (UID: \"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36\") " Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.985853 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs" (OuterVolumeSpecName: "logs") pod "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" (UID: 
"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:08 crc kubenswrapper[4744]: I1205 20:35:08.987713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw" (OuterVolumeSpecName: "kube-api-access-r7tdw") pod "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36"). InnerVolumeSpecName "kube-api-access-r7tdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.012966 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.022402 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.035979 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data" (OuterVolumeSpecName: "config-data") pod "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" (UID: "cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.038934 4744 generic.go:334] "Generic (PLEG): container finished" podID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" containerID="cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167" exitCode=0 Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.039112 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36","Type":"ContainerDied","Data":"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167"} Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.039209 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36","Type":"ContainerDied","Data":"735b6933d79895c4701a7a64ee846ce68039828128de8e94e116d0bd41669fee"} Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.039285 4744 scope.go:117] "RemoveContainer" containerID="cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.039456 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.044277 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.044325 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.044371 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"fe602728-cd71-4a0d-8a05-08c295621691","Type":"ContainerDied","Data":"deaa3d25d8c22a8df2a1bb5a1bd0ed5e6ba1718e73f47662d49dfa1a6af3929d"} Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.099866 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.099910 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7tdw\" (UniqueName: \"kubernetes.io/projected/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-kube-api-access-r7tdw\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.099926 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.099943 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.099959 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.100177 4744 scope.go:117] "RemoveContainer" containerID="cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.100848 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167\": container with ID starting with cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167 not found: ID does not exist" containerID="cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.100880 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167"} err="failed to get container status \"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167\": rpc error: code = NotFound desc = could not find container \"cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167\": container with ID starting with cbb592065d2e1d740cf22a92c9dfcf7515f80ac65142629522be89dad31a9167 not found: ID does not exist" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.100899 4744 scope.go:117] "RemoveContainer" containerID="7efa5908db0ba563ad2513dc7c8154f8c81e50ce36ccc7d7e1be0601cceb74b4" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.102598 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.115486 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 
20:35:09.130777 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.139881 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.150087 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.159067 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.172452 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173189 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" containerName="mariadb-account-delete" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173215 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" containerName="mariadb-account-delete" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173237 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="proxy-httpd" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173246 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="proxy-httpd" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173261 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe602728-cd71-4a0d-8a05-08c295621691" containerName="watcher-applier" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173271 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe602728-cd71-4a0d-8a05-08c295621691" containerName="watcher-applier" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173287 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-notification-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173313 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-notification-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173327 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-central-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173335 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-central-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173345 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-kuttl-api-log" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173354 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-kuttl-api-log" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173367 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-api" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173376 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-api" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173392 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="sg-core" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173399 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="sg-core" Dec 05 20:35:09 crc kubenswrapper[4744]: E1205 20:35:09.173411 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" containerName="watcher-decision-engine" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173418 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" containerName="watcher-decision-engine" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173612 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-notification-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173632 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="sg-core" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173643 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="proxy-httpd" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173652 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe602728-cd71-4a0d-8a05-08c295621691" containerName="watcher-applier" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173661 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-kuttl-api-log" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173674 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" containerName="watcher-decision-engine" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173688 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" containerName="ceilometer-central-agent" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173702 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fbdeb6-0aa3-4191-b1a3-1f61d99e38a9" containerName="mariadb-account-delete" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.173714 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8397b5a7-4151-4c3b-a1c9-199df35771d9" containerName="watcher-api" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.175599 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.178279 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.178686 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.179567 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.189081 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.302781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303056 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303084 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303098 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303142 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj2m\" (UniqueName: \"kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303179 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.303193 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.404619 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.404868 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.404971 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405156 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405227 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj2m\" (UniqueName: \"kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405309 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.405983 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") 
" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.406164 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.409068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.409209 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.409899 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.410226 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.410386 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.428658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj2m\" (UniqueName: \"kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m\") pod \"ceilometer-0\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.506119 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.722717 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-2h2w9"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.724435 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.734460 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-2h2w9"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.745224 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-e303-account-create-update-vltbz"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.746608 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.752275 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.754875 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e303-account-create-update-vltbz"] Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.812912 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.812952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.813002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpm6p\" (UniqueName: \"kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.813045 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.817739 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:09 crc kubenswrapper[4744]: W1205 20:35:09.825203 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45503749_4795_44ec_9172_565149962da5.slice/crio-10025c21f84e214ccbd3195adb85cf6b076c3bdc759a5e0b05d315c959ba46d5 WatchSource:0}: Error finding container 10025c21f84e214ccbd3195adb85cf6b076c3bdc759a5e0b05d315c959ba46d5: Status 404 returned error can't find the container with id 10025c21f84e214ccbd3195adb85cf6b076c3bdc759a5e0b05d315c959ba46d5 Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.914359 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bpm6p\" (UniqueName: \"kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.914442 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.914566 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.914593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.915353 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.915374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.939243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2\") pod \"watcher-db-create-2h2w9\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:09 crc kubenswrapper[4744]: I1205 20:35:09.939934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpm6p\" (UniqueName: \"kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p\") pod \"watcher-e303-account-create-update-vltbz\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.050398 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.058792 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerStarted","Data":"10025c21f84e214ccbd3195adb85cf6b076c3bdc759a5e0b05d315c959ba46d5"} Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.064434 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.091533 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3efc84d7-14e5-4b45-9913-5be849b305ee" path="/var/lib/kubelet/pods/3efc84d7-14e5-4b45-9913-5be849b305ee/volumes" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.092577 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36" path="/var/lib/kubelet/pods/cc5f0e4a-62fa-4b20-bb8f-4de79b50dc36/volumes" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.093066 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe602728-cd71-4a0d-8a05-08c295621691" path="/var/lib/kubelet/pods/fe602728-cd71-4a0d-8a05-08c295621691/volumes" Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.549963 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-2h2w9"] Dec 05 20:35:10 crc kubenswrapper[4744]: W1205 20:35:10.550366 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda99a6131_64b7_4918_8054_203a827907cc.slice/crio-e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d WatchSource:0}: Error finding container e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d: Status 404 returned error can't find the container with id e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d Dec 05 20:35:10 crc kubenswrapper[4744]: I1205 20:35:10.614938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-e303-account-create-update-vltbz"] Dec 05 20:35:10 crc kubenswrapper[4744]: W1205 20:35:10.617403 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb910857e_8f65_450d_8d46_d14e45db7cf4.slice/crio-caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791 WatchSource:0}: Error finding container caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791: Status 404 returned error can't find the container with id caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791 Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.076862 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerStarted","Data":"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd"} Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.082560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-2h2w9" event={"ID":"a99a6131-64b7-4918-8054-203a827907cc","Type":"ContainerStarted","Data":"9cf7325abf7b894ef7e81fd88034d3caacae387c0810a43d85cc18f0a1849d2b"} Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.082609 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-db-create-2h2w9" event={"ID":"a99a6131-64b7-4918-8054-203a827907cc","Type":"ContainerStarted","Data":"e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d"} Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.084505 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" event={"ID":"b910857e-8f65-450d-8d46-d14e45db7cf4","Type":"ContainerStarted","Data":"1ee8afc9cfb0dcd9fbc94f6649cd5d76c15e557d931027df25758c40c4e731cb"} Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.084547 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" event={"ID":"b910857e-8f65-450d-8d46-d14e45db7cf4","Type":"ContainerStarted","Data":"caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791"} Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.118945 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-2h2w9" podStartSLOduration=2.118926157 podStartE2EDuration="2.118926157s" podCreationTimestamp="2025-12-05 20:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:11.108980553 +0000 UTC m=+1481.338791931" watchObservedRunningTime="2025-12-05 20:35:11.118926157 +0000 UTC m=+1481.348737525" Dec 05 20:35:11 crc kubenswrapper[4744]: I1205 20:35:11.132486 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" podStartSLOduration=2.13247118 podStartE2EDuration="2.13247118s" podCreationTimestamp="2025-12-05 20:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:11.128612565 +0000 UTC m=+1481.358423933" watchObservedRunningTime="2025-12-05 20:35:11.13247118 +0000 UTC m=+1481.362282548" Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 20:35:12.099088 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerStarted","Data":"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184"} Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 20:35:12.099513 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerStarted","Data":"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00"} Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 20:35:12.100365 4744 generic.go:334] "Generic (PLEG): container finished" podID="a99a6131-64b7-4918-8054-203a827907cc" containerID="9cf7325abf7b894ef7e81fd88034d3caacae387c0810a43d85cc18f0a1849d2b" exitCode=0 Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 20:35:12.100433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-2h2w9" event={"ID":"a99a6131-64b7-4918-8054-203a827907cc","Type":"ContainerDied","Data":"9cf7325abf7b894ef7e81fd88034d3caacae387c0810a43d85cc18f0a1849d2b"} Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 20:35:12.102261 4744 generic.go:334] "Generic (PLEG): container finished" podID="b910857e-8f65-450d-8d46-d14e45db7cf4" containerID="1ee8afc9cfb0dcd9fbc94f6649cd5d76c15e557d931027df25758c40c4e731cb" exitCode=0 Dec 05 20:35:12 crc kubenswrapper[4744]: I1205 
20:35:12.102353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" event={"ID":"b910857e-8f65-450d-8d46-d14e45db7cf4","Type":"ContainerDied","Data":"1ee8afc9cfb0dcd9fbc94f6649cd5d76c15e557d931027df25758c40c4e731cb"} Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.115311 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerStarted","Data":"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4"} Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.115879 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.155371 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.2876613749999999 podStartE2EDuration="4.155347552s" podCreationTimestamp="2025-12-05 20:35:09 +0000 UTC" firstStartedPulling="2025-12-05 20:35:09.828678059 +0000 UTC m=+1480.058489427" lastFinishedPulling="2025-12-05 20:35:12.696364236 +0000 UTC m=+1482.926175604" observedRunningTime="2025-12-05 20:35:13.143193583 +0000 UTC m=+1483.373004951" watchObservedRunningTime="2025-12-05 20:35:13.155347552 +0000 UTC m=+1483.385158920" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.589577 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.595224 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.667986 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpm6p\" (UniqueName: \"kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p\") pod \"b910857e-8f65-450d-8d46-d14e45db7cf4\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.668055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2\") pod \"a99a6131-64b7-4918-8054-203a827907cc\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.668174 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts\") pod \"a99a6131-64b7-4918-8054-203a827907cc\" (UID: \"a99a6131-64b7-4918-8054-203a827907cc\") " Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.668227 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts\") pod \"b910857e-8f65-450d-8d46-d14e45db7cf4\" (UID: \"b910857e-8f65-450d-8d46-d14e45db7cf4\") " Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.669160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b910857e-8f65-450d-8d46-d14e45db7cf4" (UID: "b910857e-8f65-450d-8d46-d14e45db7cf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.669423 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a99a6131-64b7-4918-8054-203a827907cc" (UID: "a99a6131-64b7-4918-8054-203a827907cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.671762 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2" (OuterVolumeSpecName: "kube-api-access-l2jq2") pod "a99a6131-64b7-4918-8054-203a827907cc" (UID: "a99a6131-64b7-4918-8054-203a827907cc"). InnerVolumeSpecName "kube-api-access-l2jq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.672260 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p" (OuterVolumeSpecName: "kube-api-access-bpm6p") pod "b910857e-8f65-450d-8d46-d14e45db7cf4" (UID: "b910857e-8f65-450d-8d46-d14e45db7cf4"). InnerVolumeSpecName "kube-api-access-bpm6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.770685 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99a6131-64b7-4918-8054-203a827907cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.770743 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b910857e-8f65-450d-8d46-d14e45db7cf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.770759 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpm6p\" (UniqueName: \"kubernetes.io/projected/b910857e-8f65-450d-8d46-d14e45db7cf4-kube-api-access-bpm6p\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:13 crc kubenswrapper[4744]: I1205 20:35:13.770775 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/a99a6131-64b7-4918-8054-203a827907cc-kube-api-access-l2jq2\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.128362 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-2h2w9" event={"ID":"a99a6131-64b7-4918-8054-203a827907cc","Type":"ContainerDied","Data":"e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d"} Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.128400 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2b1beb868f0e50b3ef4ec194e8cbdedd2f7637ececb89bb964e4c28344ecd8d" Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.128414 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-2h2w9" Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.131835 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.132157 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-e303-account-create-update-vltbz" event={"ID":"b910857e-8f65-450d-8d46-d14e45db7cf4","Type":"ContainerDied","Data":"caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791"} Dec 05 20:35:14 crc kubenswrapper[4744]: I1205 20:35:14.132266 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caeb887909b832a4152a9d62ef02e985b61ba1240d1e8d141fadaf9b6e9b7791" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.077250 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-n95fx"] Dec 05 20:35:15 crc kubenswrapper[4744]: E1205 20:35:15.078106 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99a6131-64b7-4918-8054-203a827907cc" containerName="mariadb-database-create" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.078118 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99a6131-64b7-4918-8054-203a827907cc" containerName="mariadb-database-create" Dec 05 20:35:15 crc kubenswrapper[4744]: E1205 20:35:15.078130 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b910857e-8f65-450d-8d46-d14e45db7cf4" containerName="mariadb-account-create-update" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.078136 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b910857e-8f65-450d-8d46-d14e45db7cf4" containerName="mariadb-account-create-update" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.078349 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b910857e-8f65-450d-8d46-d14e45db7cf4" containerName="mariadb-account-create-update" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.078373 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99a6131-64b7-4918-8054-203a827907cc" containerName="mariadb-database-create" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.078939 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.081883 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-2df5p" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.084008 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.092126 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-n95fx"] Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.093054 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.093286 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.093383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wznz9\" (UniqueName: \"kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.093409 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.194210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.194860 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wznz9\" (UniqueName: \"kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.194886 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc 
kubenswrapper[4744]: I1205 20:35:15.194914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.199078 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.200060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.200770 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.211902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wznz9\" (UniqueName: \"kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9\") pod \"watcher-kuttl-db-sync-n95fx\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.396275 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:15 crc kubenswrapper[4744]: I1205 20:35:15.904343 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-n95fx"] Dec 05 20:35:15 crc kubenswrapper[4744]: W1205 20:35:15.911462 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod042ae314_8ed7_494e_bb91_ac44f4f12097.slice/crio-9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f WatchSource:0}: Error finding container 9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f: Status 404 returned error can't find the container with id 9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f Dec 05 20:35:16 crc kubenswrapper[4744]: I1205 20:35:16.146899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" event={"ID":"042ae314-8ed7-494e-bb91-ac44f4f12097","Type":"ContainerStarted","Data":"eadbfc5ee2005ba399e4d0e0307660149410178262f88158328a83cae16e4612"} Dec 05 20:35:16 crc kubenswrapper[4744]: I1205 20:35:16.146947 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" event={"ID":"042ae314-8ed7-494e-bb91-ac44f4f12097","Type":"ContainerStarted","Data":"9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f"} Dec 05 20:35:16 crc kubenswrapper[4744]: I1205 20:35:16.166763 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" podStartSLOduration=1.166742625 podStartE2EDuration="1.166742625s" podCreationTimestamp="2025-12-05 20:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:16.160475412 +0000 UTC m=+1486.390286790" watchObservedRunningTime="2025-12-05 20:35:16.166742625 +0000 UTC m=+1486.396554003" Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.171442 4744 generic.go:334] "Generic (PLEG): container finished" podID="042ae314-8ed7-494e-bb91-ac44f4f12097" containerID="eadbfc5ee2005ba399e4d0e0307660149410178262f88158328a83cae16e4612" exitCode=0 Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.171524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" event={"ID":"042ae314-8ed7-494e-bb91-ac44f4f12097","Type":"ContainerDied","Data":"eadbfc5ee2005ba399e4d0e0307660149410178262f88158328a83cae16e4612"} Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.806368 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.806759 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.806833 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.807801 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:35:19 crc kubenswrapper[4744]: I1205 20:35:19.807905 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c" gracePeriod=600 Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.182485 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c" exitCode=0 Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.182559 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c"} Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.182593 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"} Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.182608 4744 scope.go:117] "RemoveContainer" containerID="52a7f6284055fc7f936355b093cc061c593ac88f5c9486e893ae19c6a9299d8d" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.483798 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.598584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data\") pod \"042ae314-8ed7-494e-bb91-ac44f4f12097\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.598666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle\") pod \"042ae314-8ed7-494e-bb91-ac44f4f12097\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.598699 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data\") pod \"042ae314-8ed7-494e-bb91-ac44f4f12097\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.598801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wznz9\" (UniqueName: \"kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9\") pod \"042ae314-8ed7-494e-bb91-ac44f4f12097\" (UID: \"042ae314-8ed7-494e-bb91-ac44f4f12097\") " Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.603631 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9" (OuterVolumeSpecName: "kube-api-access-wznz9") pod "042ae314-8ed7-494e-bb91-ac44f4f12097" (UID: "042ae314-8ed7-494e-bb91-ac44f4f12097"). InnerVolumeSpecName "kube-api-access-wznz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.603740 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "042ae314-8ed7-494e-bb91-ac44f4f12097" (UID: "042ae314-8ed7-494e-bb91-ac44f4f12097"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.620232 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "042ae314-8ed7-494e-bb91-ac44f4f12097" (UID: "042ae314-8ed7-494e-bb91-ac44f4f12097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.642149 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data" (OuterVolumeSpecName: "config-data") pod "042ae314-8ed7-494e-bb91-ac44f4f12097" (UID: "042ae314-8ed7-494e-bb91-ac44f4f12097"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.699983 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.700012 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.700025 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/042ae314-8ed7-494e-bb91-ac44f4f12097-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:20 crc kubenswrapper[4744]: I1205 20:35:20.700037 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wznz9\" (UniqueName: \"kubernetes.io/projected/042ae314-8ed7-494e-bb91-ac44f4f12097-kube-api-access-wznz9\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.195210 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" event={"ID":"042ae314-8ed7-494e-bb91-ac44f4f12097","Type":"ContainerDied","Data":"9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f"} Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.195234 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-n95fx" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.195515 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d86588497065c69f5b8f884e9bd37b658eaf3b01689afc57e5fe59631449e6f" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.521804 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: E1205 20:35:21.522258 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042ae314-8ed7-494e-bb91-ac44f4f12097" containerName="watcher-kuttl-db-sync" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.522284 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="042ae314-8ed7-494e-bb91-ac44f4f12097" containerName="watcher-kuttl-db-sync" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.522623 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="042ae314-8ed7-494e-bb91-ac44f4f12097" containerName="watcher-kuttl-db-sync" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.523409 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.524974 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.527464 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-2df5p" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.529244 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.599447 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.601480 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.603640 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.609577 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.619380 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.620090 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.620195 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.620232 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.620403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mc8\" (UniqueName: \"kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.620488 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.621077 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.635806 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.637305 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.645570 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.666969 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.724933 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.724994 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725052 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725079 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725097 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725111 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725123 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725144 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sfbf\" (UniqueName: \"kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725219 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mc8\" (UniqueName: \"kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725256 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8rp\" (UniqueName: \"kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725327 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725375 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.725503 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.732007 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.732083 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.732610 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.742060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mc8\" (UniqueName: \"kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827024 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827096 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: 
\"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sfbf\" (UniqueName: \"kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827185 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827218 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps8rp\" (UniqueName: \"kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.827263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.828392 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 
crc kubenswrapper[4744]: I1205 20:35:21.829534 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.832363 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.832448 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.833173 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.833269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.833811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.834276 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.837866 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.840378 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.846901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sfbf\" (UniqueName: \"kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf\") pod \"watcher-kuttl-api-0\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.856805 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8rp\" (UniqueName: \"kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp\") pod \"watcher-kuttl-applier-0\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.923554 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:21 crc kubenswrapper[4744]: I1205 20:35:21.971707 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:22 crc kubenswrapper[4744]: I1205 20:35:22.369584 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:22 crc kubenswrapper[4744]: I1205 20:35:22.496532 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:22 crc kubenswrapper[4744]: I1205 20:35:22.536762 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.218438 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"545e6f9e-ddf0-43e6-b712-897b54463135","Type":"ContainerStarted","Data":"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.218936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"545e6f9e-ddf0-43e6-b712-897b54463135","Type":"ContainerStarted","Data":"5a1a219dea507276d6c1e64bb0ff912b906791623405ea21cba5c12ec4f72587"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.221584 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bd44a318-1763-4262-98ba-76c75ca8154b","Type":"ContainerStarted","Data":"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.221670 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bd44a318-1763-4262-98ba-76c75ca8154b","Type":"ContainerStarted","Data":"4c82fc08ceaeb43fe94b0617b0504ea9ba708b945da5bd799d77cce32e1cc90a"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.228564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerStarted","Data":"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.228599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerStarted","Data":"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.228611 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerStarted","Data":"3a1c1169deaca1dbb61659ea3373827d54545720db271fb6ddd00d9fb3f728a5"} Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.228754 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.249941 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.249924991 podStartE2EDuration="2.249924991s" podCreationTimestamp="2025-12-05 20:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:23.245146965 +0000 UTC m=+1493.474958333" watchObservedRunningTime="2025-12-05 20:35:23.249924991 +0000 UTC m=+1493.479736359" Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.278128 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.278112083 podStartE2EDuration="2.278112083s" podCreationTimestamp="2025-12-05 20:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:23.268830236 +0000 UTC m=+1493.498641604" watchObservedRunningTime="2025-12-05 20:35:23.278112083 +0000 UTC m=+1493.507923451" Dec 05 20:35:23 crc kubenswrapper[4744]: I1205 20:35:23.300723 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.300700898 podStartE2EDuration="2.300700898s" podCreationTimestamp="2025-12-05 20:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:23.293644285 +0000 UTC m=+1493.523455653" watchObservedRunningTime="2025-12-05 20:35:23.300700898 +0000 UTC m=+1493.530512266" Dec 05 20:35:25 crc kubenswrapper[4744]: I1205 20:35:25.700587 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:26 crc kubenswrapper[4744]: I1205 20:35:26.924364 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:26 crc kubenswrapper[4744]: I1205 20:35:26.972436 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:31 crc kubenswrapper[4744]: I1205 20:35:31.840858 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:31 crc kubenswrapper[4744]: I1205 20:35:31.880469 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:31 crc kubenswrapper[4744]: I1205 20:35:31.923811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:31 crc 
kubenswrapper[4744]: I1205 20:35:31.947538 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:31 crc kubenswrapper[4744]: I1205 20:35:31.972486 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:31 crc kubenswrapper[4744]: I1205 20:35:31.994975 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:32 crc kubenswrapper[4744]: I1205 20:35:32.312644 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:32 crc kubenswrapper[4744]: I1205 20:35:32.321610 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:32 crc kubenswrapper[4744]: I1205 20:35:32.336849 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:32 crc kubenswrapper[4744]: I1205 20:35:32.339764 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.481594 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.482148 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-central-agent" containerID="cri-o://3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd" gracePeriod=30 Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.482396 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="45503749-4795-44ec-9172-565149962da5" containerName="proxy-httpd" containerID="cri-o://0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4" gracePeriod=30 Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.482455 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="45503749-4795-44ec-9172-565149962da5" containerName="sg-core" containerID="cri-o://4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184" gracePeriod=30 Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.482503 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-notification-agent" containerID="cri-o://e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00" gracePeriod=30 Dec 05 20:35:34 crc kubenswrapper[4744]: I1205 20:35:34.504370 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerDied","Data":"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4"} Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339284 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="45503749-4795-44ec-9172-565149962da5" containerID="0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4" exitCode=0 Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339403 4744 generic.go:334] "Generic (PLEG): container finished" podID="45503749-4795-44ec-9172-565149962da5" containerID="4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184" exitCode=2 Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339416 4744 generic.go:334] "Generic (PLEG): container finished" podID="45503749-4795-44ec-9172-565149962da5" containerID="3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd" exitCode=0 Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339459 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerDied","Data":"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184"} Dec 05 20:35:35 crc kubenswrapper[4744]: I1205 20:35:35.339476 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerDied","Data":"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd"} Dec 05 20:35:36 crc kubenswrapper[4744]: I1205 20:35:36.568658 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:36 crc kubenswrapper[4744]: I1205 20:35:36.569208 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-kuttl-api-log" containerID="cri-o://c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2" gracePeriod=30 Dec 05 20:35:36 crc kubenswrapper[4744]: I1205 20:35:36.569500 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-api" containerID="cri-o://ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51" gracePeriod=30 Dec 05 20:35:37 crc kubenswrapper[4744]: I1205 20:35:37.355514 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerID="c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2" exitCode=143 Dec 05 20:35:37 crc kubenswrapper[4744]: I1205 20:35:37.355574 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerDied","Data":"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2"} Dec 05 20:35:37 crc kubenswrapper[4744]: I1205 20:35:37.433973 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.151:9322/\": read tcp 10.217.0.2:32808->10.217.0.151:9322: read: connection reset by peer" Dec 05 20:35:37 crc kubenswrapper[4744]: I1205 20:35:37.434115 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.151:9322/\": read tcp 10.217.0.2:32802->10.217.0.151:9322: read: connection reset by peer" Dec 05 20:35:38 crc 
kubenswrapper[4744]: I1205 20:35:38.007174 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.018795 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125511 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125546 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkj2m\" (UniqueName: \"kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125597 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125656 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125725 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125770 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125794 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125815 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 
20:35:38.125838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125866 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125912 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125935 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125960 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sfbf\" (UniqueName: \"kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf\") pod \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\" (UID: \"c5d5d7e4-91d1-470a-aa50-c57c025983ad\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.125987 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd\") pod \"45503749-4795-44ec-9172-565149962da5\" (UID: \"45503749-4795-44ec-9172-565149962da5\") " Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.127325 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.127723 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.128053 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs" (OuterVolumeSpecName: "logs") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.139916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts" (OuterVolumeSpecName: "scripts") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.148508 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m" (OuterVolumeSpecName: "kube-api-access-zkj2m") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "kube-api-access-zkj2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.169942 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf" (OuterVolumeSpecName: "kube-api-access-2sfbf") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "kube-api-access-2sfbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.200497 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.203443 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.204916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.220405 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227441 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227466 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227475 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227483 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227492 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227501 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sfbf\" (UniqueName: \"kubernetes.io/projected/c5d5d7e4-91d1-470a-aa50-c57c025983ad-kube-api-access-2sfbf\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227510 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45503749-4795-44ec-9172-565149962da5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227517 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkj2m\" (UniqueName: \"kubernetes.io/projected/45503749-4795-44ec-9172-565149962da5-kube-api-access-zkj2m\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227525 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.227533 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d5d7e4-91d1-470a-aa50-c57c025983ad-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.240186 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data" (OuterVolumeSpecName: "config-data") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.255457 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.261484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.270813 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c5d5d7e4-91d1-470a-aa50-c57c025983ad" (UID: "c5d5d7e4-91d1-470a-aa50-c57c025983ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.283673 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data" (OuterVolumeSpecName: "config-data") pod "45503749-4795-44ec-9172-565149962da5" (UID: "45503749-4795-44ec-9172-565149962da5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.328777 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.328807 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.328816 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.328827 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45503749-4795-44ec-9172-565149962da5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.328836 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d5d7e4-91d1-470a-aa50-c57c025983ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.366278 4744 generic.go:334] "Generic (PLEG): container finished" podID="45503749-4795-44ec-9172-565149962da5" containerID="e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00" exitCode=0 Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.366348 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.366353 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerDied","Data":"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00"} Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.366437 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"45503749-4795-44ec-9172-565149962da5","Type":"ContainerDied","Data":"10025c21f84e214ccbd3195adb85cf6b076c3bdc759a5e0b05d315c959ba46d5"} Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.366484 4744 scope.go:117] "RemoveContainer" containerID="0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.371903 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.371917 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerID="ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51" exitCode=0 Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.371943 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerDied","Data":"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51"} Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.372243 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c5d5d7e4-91d1-470a-aa50-c57c025983ad","Type":"ContainerDied","Data":"3a1c1169deaca1dbb61659ea3373827d54545720db271fb6ddd00d9fb3f728a5"} Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.404253 4744 scope.go:117] "RemoveContainer" containerID="4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.419924 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.432804 4744 scope.go:117] "RemoveContainer" containerID="e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.444637 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.452302 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.466109 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.471756 4744 scope.go:117] "RemoveContainer" containerID="3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.471943 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472386 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-api" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472404 
4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-api" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472422 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472432 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472452 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45503749-4795-44ec-9172-565149962da5" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472460 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45503749-4795-44ec-9172-565149962da5" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472477 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472486 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472506 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-kuttl-api-log" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472515 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-kuttl-api-log" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.472530 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45503749-4795-44ec-9172-565149962da5" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.472538 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45503749-4795-44ec-9172-565149962da5" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473077 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-central-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473107 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45503749-4795-44ec-9172-565149962da5" containerName="proxy-httpd" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473118 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45503749-4795-44ec-9172-565149962da5" containerName="ceilometer-notification-agent" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473141 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-api" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473156 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" containerName="watcher-kuttl-api-log" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.473173 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45503749-4795-44ec-9172-565149962da5" containerName="sg-core" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.475459 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.477361 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.478909 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.479084 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.479625 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.479878 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.481191 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.481405 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.481561 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.488187 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.508519 4744 scope.go:117] "RemoveContainer" containerID="0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.509585 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4\": container with ID starting with 0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4 not found: ID does not exist" containerID="0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.509674 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4"} err="failed to get container status \"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4\": rpc error: code = NotFound desc = could not find container \"0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4\": container with ID starting with 0574b8b042f640d5d4db6499b87c41d8148a3cec3b3cc5f07ba8bc9d79d7dcd4 not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.509747 4744 scope.go:117] "RemoveContainer" containerID="4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.510337 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184\": container with ID starting with 4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184 not found: ID does not exist" containerID="4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184" Dec 05 
20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.510506 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184"} err="failed to get container status \"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184\": rpc error: code = NotFound desc = could not find container \"4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184\": container with ID starting with 4c06099961a50968883f647a1e3c92069ad610c8f0970debef2f2c4dc1283184 not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.510607 4744 scope.go:117] "RemoveContainer" containerID="e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.511777 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00\": container with ID starting with e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00 not found: ID does not exist" containerID="e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.511816 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00"} err="failed to get container status \"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00\": rpc error: code = NotFound desc = could not find container \"e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00\": container with ID starting with e852bbc9521245d62ca37c3281b23b1216fdf473d30a2f374be1c046ab15aa00 not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.511845 4744 scope.go:117] "RemoveContainer" containerID="3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.512313 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd\": container with ID starting with 3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd not found: ID does not exist" containerID="3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.512519 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd"} err="failed to get container status \"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd\": rpc error: code = NotFound desc = could not find container \"3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd\": container with ID starting with 3861621ea0a0644f5ea217c77e8f7e66056d6459ade6fa5bc18a7c4e4cff7cbd not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.512555 4744 scope.go:117] "RemoveContainer" containerID="ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.512719 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531021 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531149 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531179 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531324 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49mb\" (UniqueName: \"kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.531352 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.541317 4744 scope.go:117] "RemoveContainer" containerID="c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.561167 4744 scope.go:117] "RemoveContainer" containerID="ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.561934 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51\": container with ID starting with ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51 not found: ID does not exist" containerID="ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.561967 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51"} err="failed to get container status \"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51\": rpc error: code = NotFound desc = could not find container \"ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51\": container with ID starting with ce65cec1fa37db55a60b2638bad6fb0cd02b0465a92603caa1f76ec5679b8d51 not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.561988 4744 scope.go:117] "RemoveContainer" containerID="c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2" Dec 05 20:35:38 crc kubenswrapper[4744]: E1205 20:35:38.562277 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2\": container with ID starting with c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2 not found: ID does not exist" containerID="c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.562317 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2"} err="failed to get container status \"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2\": rpc error: code = NotFound desc = could not find container \"c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2\": container with ID starting with c38c7b2ebb0364adc4c8f920e538c5e5d2fdfc4b1af99e4251da581a9fe188e2 not found: ID does not exist" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632266 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vtw\" (UniqueName: \"kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632730 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632818 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49mb\" (UniqueName: \"kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632908 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.632992 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633082 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633185 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633276 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633398 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633654 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc 
kubenswrapper[4744]: I1205 20:35:38.633658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633812 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633931 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.633995 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.636460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.637847 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.638602 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.640805 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.641385 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data\") pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.654985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49mb\" (UniqueName: \"kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb\") 
pod \"ceilometer-0\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.735907 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.735966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736011 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736040 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vtw\" (UniqueName: \"kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736074 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736098 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736586 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.736698 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.739437 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc 
kubenswrapper[4744]: I1205 20:35:38.739778 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.740195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.741205 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.742883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.754183 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vtw\" (UniqueName: \"kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw\") pod \"watcher-kuttl-api-0\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.813211 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:38 crc kubenswrapper[4744]: I1205 20:35:38.822956 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:39 crc kubenswrapper[4744]: I1205 20:35:39.335925 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:39 crc kubenswrapper[4744]: I1205 20:35:39.385225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerStarted","Data":"5624a72fa6c927571fc15edb5462b04b24e0c6af2e1088ccc1144fd86847aab1"} Dec 05 20:35:39 crc kubenswrapper[4744]: I1205 20:35:39.427925 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:39 crc kubenswrapper[4744]: W1205 20:35:39.428812 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod890ee3bb_c943_4b52_9a6e_d97ab4a5d969.slice/crio-8be208a5f22bd304f99e2de8d3c242f7758e42531faeaa1cb3c66265289127b3 WatchSource:0}: Error finding container 8be208a5f22bd304f99e2de8d3c242f7758e42531faeaa1cb3c66265289127b3: Status 404 returned error can't find the container with id 8be208a5f22bd304f99e2de8d3c242f7758e42531faeaa1cb3c66265289127b3 Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.094453 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45503749-4795-44ec-9172-565149962da5" path="/var/lib/kubelet/pods/45503749-4795-44ec-9172-565149962da5/volumes" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.095696 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d5d7e4-91d1-470a-aa50-c57c025983ad" path="/var/lib/kubelet/pods/c5d5d7e4-91d1-470a-aa50-c57c025983ad/volumes" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.399142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerStarted","Data":"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35"} Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.399400 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerStarted","Data":"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b"} Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.399420 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.401558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerStarted","Data":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.401599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerStarted","Data":"8be208a5f22bd304f99e2de8d3c242f7758e42531faeaa1cb3c66265289127b3"} Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.421484 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.421464005 podStartE2EDuration="2.421464005s" podCreationTimestamp="2025-12-05 20:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:35:40.417357525 +0000 UTC m=+1510.647168903" watchObservedRunningTime="2025-12-05 20:35:40.421464005 +0000 UTC m=+1510.651275373" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.672140 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-n95fx"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.680344 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-n95fx"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.712159 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchere303-account-delete-ktlj9"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.713575 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.727268 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere303-account-delete-ktlj9"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.782227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.782387 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkdt\" (UniqueName: \"kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.790851 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.791060 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="545e6f9e-ddf0-43e6-b712-897b54463135" containerName="watcher-decision-engine" containerID="cri-o://a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f" gracePeriod=30 Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.885349 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.891265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.892134 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 
20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.892373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkdt\" (UniqueName: \"kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.930823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkdt\" (UniqueName: \"kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt\") pod \"watchere303-account-delete-ktlj9\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.946765 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:40 crc kubenswrapper[4744]: I1205 20:35:40.946967 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" containerName="watcher-applier" containerID="cri-o://c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" gracePeriod=30 Dec 05 20:35:41 crc kubenswrapper[4744]: I1205 20:35:41.038057 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:41 crc kubenswrapper[4744]: I1205 20:35:41.409920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerStarted","Data":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} Dec 05 20:35:41 crc kubenswrapper[4744]: I1205 20:35:41.410335 4744 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-api-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-2df5p\" not found" Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.503624 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.503701 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data podName:bfab7c4f-2ead-4d5c-9030-470b94cbf936 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:42.003681568 +0000 UTC m=+1512.233492936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data") pod "watcher-kuttl-api-0" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936") : secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:41 crc kubenswrapper[4744]: W1205 20:35:41.585741 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode87e619c_7589_4c51_89f4_16225dfa63db.slice/crio-9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a WatchSource:0}: Error finding container 9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a: Status 404 returned error can't find the container with id 9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a Dec 05 20:35:41 crc kubenswrapper[4744]: I1205 20:35:41.597324 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchere303-account-delete-ktlj9"] Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.974799 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.977556 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.979383 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:35:41 crc kubenswrapper[4744]: E1205 20:35:41.979455 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" containerName="watcher-applier" Dec 05 20:35:42 crc kubenswrapper[4744]: E1205 20:35:42.011395 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:42 crc kubenswrapper[4744]: E1205 20:35:42.011473 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data podName:bfab7c4f-2ead-4d5c-9030-470b94cbf936 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:43.011453651 +0000 UTC m=+1513.241265019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data") pod "watcher-kuttl-api-0" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936") : secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.091613 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042ae314-8ed7-494e-bb91-ac44f4f12097" path="/var/lib/kubelet/pods/042ae314-8ed7-494e-bb91-ac44f4f12097/volumes" Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.420518 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerStarted","Data":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.422863 4744 generic.go:334] "Generic (PLEG): container finished" podID="e87e619c-7589-4c51-89f4-16225dfa63db" containerID="9de0d954d9caf7c5d5eb39e4e457edec9303f4562c1386f703c80fe2776f91a8" exitCode=0 Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.422948 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.423039 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" event={"ID":"e87e619c-7589-4c51-89f4-16225dfa63db","Type":"ContainerDied","Data":"9de0d954d9caf7c5d5eb39e4e457edec9303f4562c1386f703c80fe2776f91a8"} Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.423139 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" event={"ID":"e87e619c-7589-4c51-89f4-16225dfa63db","Type":"ContainerStarted","Data":"9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a"} Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.423083 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-kuttl-api-log" containerID="cri-o://cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.423444 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" containerID="cri-o://2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35" gracePeriod=30 Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.427281 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": EOF" Dec 05 20:35:42 crc kubenswrapper[4744]: I1205 20:35:42.428792 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": read tcp 10.217.0.2:41422->10.217.0.154:9322: read: connection reset by peer" Dec 05 20:35:43 crc kubenswrapper[4744]: E1205 20:35:43.027863 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:43 crc 
kubenswrapper[4744]: E1205 20:35:43.028196 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data podName:bfab7c4f-2ead-4d5c-9030-470b94cbf936 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:45.028180127 +0000 UTC m=+1515.257991495 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data") pod "watcher-kuttl-api-0" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936") : secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.432214 4744 generic.go:334] "Generic (PLEG): container finished" podID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerID="cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b" exitCode=143 Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.432272 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerDied","Data":"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b"} Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.435686 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerStarted","Data":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.435735 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.463476 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.167374988 podStartE2EDuration="5.46345672s" podCreationTimestamp="2025-12-05 20:35:38 +0000 UTC" firstStartedPulling="2025-12-05 20:35:39.43164224 +0000 UTC m=+1509.661453598" lastFinishedPulling="2025-12-05 20:35:42.727723922 +0000 UTC m=+1512.957535330" observedRunningTime="2025-12-05 20:35:43.462105517 +0000 UTC m=+1513.691916885" watchObservedRunningTime="2025-12-05 20:35:43.46345672 +0000 UTC m=+1513.693268088" Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.825678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.923758 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:43 crc kubenswrapper[4744]: I1205 20:35:43.930615 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054244 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps8rp\" (UniqueName: \"kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp\") pod \"bd44a318-1763-4262-98ba-76c75ca8154b\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054412 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data\") pod \"bd44a318-1763-4262-98ba-76c75ca8154b\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkdt\" (UniqueName: \"kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt\") pod \"e87e619c-7589-4c51-89f4-16225dfa63db\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054539 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts\") pod \"e87e619c-7589-4c51-89f4-16225dfa63db\" (UID: \"e87e619c-7589-4c51-89f4-16225dfa63db\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054589 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs\") pod \"bd44a318-1763-4262-98ba-76c75ca8154b\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.054634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle\") pod \"bd44a318-1763-4262-98ba-76c75ca8154b\" (UID: \"bd44a318-1763-4262-98ba-76c75ca8154b\") " Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.055174 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs" (OuterVolumeSpecName: "logs") pod "bd44a318-1763-4262-98ba-76c75ca8154b" (UID: "bd44a318-1763-4262-98ba-76c75ca8154b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.055200 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e87e619c-7589-4c51-89f4-16225dfa63db" (UID: "e87e619c-7589-4c51-89f4-16225dfa63db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.060586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt" (OuterVolumeSpecName: "kube-api-access-9nkdt") pod "e87e619c-7589-4c51-89f4-16225dfa63db" (UID: "e87e619c-7589-4c51-89f4-16225dfa63db"). InnerVolumeSpecName "kube-api-access-9nkdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.060676 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp" (OuterVolumeSpecName: "kube-api-access-ps8rp") pod "bd44a318-1763-4262-98ba-76c75ca8154b" (UID: "bd44a318-1763-4262-98ba-76c75ca8154b"). InnerVolumeSpecName "kube-api-access-ps8rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.094379 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd44a318-1763-4262-98ba-76c75ca8154b" (UID: "bd44a318-1763-4262-98ba-76c75ca8154b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.130424 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data" (OuterVolumeSpecName: "config-data") pod "bd44a318-1763-4262-98ba-76c75ca8154b" (UID: "bd44a318-1763-4262-98ba-76c75ca8154b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157136 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157203 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkdt\" (UniqueName: \"kubernetes.io/projected/e87e619c-7589-4c51-89f4-16225dfa63db-kube-api-access-9nkdt\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157218 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e87e619c-7589-4c51-89f4-16225dfa63db-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157227 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd44a318-1763-4262-98ba-76c75ca8154b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157237 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd44a318-1763-4262-98ba-76c75ca8154b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.157246 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps8rp\" (UniqueName: \"kubernetes.io/projected/bd44a318-1763-4262-98ba-76c75ca8154b-kube-api-access-ps8rp\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.445506 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" event={"ID":"e87e619c-7589-4c51-89f4-16225dfa63db","Type":"ContainerDied","Data":"9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a"} Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.445564 4744 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9e0ceefde7ceb320d0bf3f685f228ae971a42b3d8675a5fb9c4280213866971a" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.445521 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchere303-account-delete-ktlj9" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.447164 4744 generic.go:334] "Generic (PLEG): container finished" podID="bd44a318-1763-4262-98ba-76c75ca8154b" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" exitCode=0 Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.447375 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bd44a318-1763-4262-98ba-76c75ca8154b","Type":"ContainerDied","Data":"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428"} Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.447424 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bd44a318-1763-4262-98ba-76c75ca8154b","Type":"ContainerDied","Data":"4c82fc08ceaeb43fe94b0617b0504ea9ba708b945da5bd799d77cce32e1cc90a"} Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.447445 4744 scope.go:117] "RemoveContainer" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.447492 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.469932 4744 scope.go:117] "RemoveContainer" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" Dec 05 20:35:44 crc kubenswrapper[4744]: E1205 20:35:44.470577 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428\": container with ID starting with c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428 not found: ID does not exist" containerID="c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.470668 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428"} err="failed to get container status \"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428\": rpc error: code = NotFound desc = could not find container \"c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428\": container with ID starting with c6d8f17d5e39daa88321c80ae48fb85bfdbbf1e3819634663d0564156ceb7428 not found: ID does not exist" Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.481819 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.494073 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:35:44 crc kubenswrapper[4744]: I1205 20:35:44.592983 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": read tcp 10.217.0.2:41432->10.217.0.154:9322: read: connection reset by peer" Dec 05 20:35:44 crc 
kubenswrapper[4744]: I1205 20:35:44.667567 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.041536 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:45 crc kubenswrapper[4744]: E1205 20:35:45.077338 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:45 crc kubenswrapper[4744]: E1205 20:35:45.077414 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data podName:bfab7c4f-2ead-4d5c-9030-470b94cbf936 nodeName:}" failed. No retries permitted until 2025-12-05 20:35:49.077396545 +0000 UTC m=+1519.307207913 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data") pod "watcher-kuttl-api-0" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936") : secret "watcher-kuttl-api-config-data" not found Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6vtw\" (UniqueName: \"kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178607 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178633 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178673 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.178718 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca\") pod \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\" (UID: \"bfab7c4f-2ead-4d5c-9030-470b94cbf936\") " Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.179322 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs" (OuterVolumeSpecName: "logs") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.179813 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfab7c4f-2ead-4d5c-9030-470b94cbf936-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.220666 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw" (OuterVolumeSpecName: "kube-api-access-r6vtw") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "kube-api-access-r6vtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.291301 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6vtw\" (UniqueName: \"kubernetes.io/projected/bfab7c4f-2ead-4d5c-9030-470b94cbf936-kube-api-access-r6vtw\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.291449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.326847 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.335459 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.375418 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.379445 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data" (OuterVolumeSpecName: "config-data") pod "bfab7c4f-2ead-4d5c-9030-470b94cbf936" (UID: "bfab7c4f-2ead-4d5c-9030-470b94cbf936"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.394157 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.394192 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.394201 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.394209 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.394219 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfab7c4f-2ead-4d5c-9030-470b94cbf936-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.457040 4744 generic.go:334] "Generic (PLEG): container finished" podID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerID="2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35" exitCode=0 Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.457101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerDied","Data":"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35"} Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.457118 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.457171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bfab7c4f-2ead-4d5c-9030-470b94cbf936","Type":"ContainerDied","Data":"5624a72fa6c927571fc15edb5462b04b24e0c6af2e1088ccc1144fd86847aab1"} Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.457199 4744 scope.go:117] "RemoveContainer" containerID="2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.478149 4744 scope.go:117] "RemoveContainer" containerID="cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.499129 4744 scope.go:117] "RemoveContainer" containerID="2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35" Dec 05 20:35:45 crc kubenswrapper[4744]: E1205 20:35:45.499512 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35\": container with ID starting with 2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35 not found: ID does not exist" containerID="2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.499564 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35"} err="failed to get container status \"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35\": rpc error: code = NotFound desc = could not find container \"2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35\": container with ID starting with 2e5d0040bb704adea51704072303040d42f2e0694921d4bef0fd0441f6d1ff35 not found: ID does not exist" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.499594 4744 scope.go:117] "RemoveContainer" containerID="cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b" Dec 05 20:35:45 crc kubenswrapper[4744]: E1205 20:35:45.499988 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b\": container with ID starting with cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b not found: ID does not exist" containerID="cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.500046 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b"} err="failed to get container status \"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b\": rpc error: code = NotFound desc = could not find container \"cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b\": container with ID starting with cd85c268b7e888769d3bb1dd6313481b1925e14d09672f9d8040b57b3066285b not found: ID does not exist" Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.523422 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.537413 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] 
Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.747753 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-2h2w9"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.754990 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-2h2w9"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.766096 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchere303-account-delete-ktlj9"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.769206 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-e303-account-create-update-vltbz"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.775735 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchere303-account-delete-ktlj9"] Dec 05 20:35:45 crc kubenswrapper[4744]: I1205 20:35:45.784959 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-e303-account-create-update-vltbz"] Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.096709 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99a6131-64b7-4918-8054-203a827907cc" path="/var/lib/kubelet/pods/a99a6131-64b7-4918-8054-203a827907cc/volumes" Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.097999 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b910857e-8f65-450d-8d46-d14e45db7cf4" path="/var/lib/kubelet/pods/b910857e-8f65-450d-8d46-d14e45db7cf4/volumes" Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.101168 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" path="/var/lib/kubelet/pods/bd44a318-1763-4262-98ba-76c75ca8154b/volumes" Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.103558 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" path="/var/lib/kubelet/pods/bfab7c4f-2ead-4d5c-9030-470b94cbf936/volumes" Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.106575 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87e619c-7589-4c51-89f4-16225dfa63db" path="/var/lib/kubelet/pods/e87e619c-7589-4c51-89f4-16225dfa63db/volumes" Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.472322 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-central-agent" containerID="cri-o://e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" gracePeriod=30 Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.472832 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="proxy-httpd" containerID="cri-o://fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" gracePeriod=30 Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.472885 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="sg-core" containerID="cri-o://f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" gracePeriod=30 Dec 05 20:35:46 crc kubenswrapper[4744]: I1205 20:35:46.472924 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-notification-agent" containerID="cri-o://c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" gracePeriod=30 Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.269745 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329162 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329260 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h49mb\" (UniqueName: \"kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329410 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.329447 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle\") pod \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\" (UID: \"890ee3bb-c943-4b52-9a6e-d97ab4a5d969\") " Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.331051 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: 
"890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.331552 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.344524 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts" (OuterVolumeSpecName: "scripts") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.355586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb" (OuterVolumeSpecName: "kube-api-access-h49mb") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "kube-api-access-h49mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.365038 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.382005 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.393711 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431075 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431110 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431119 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431131 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h49mb\" (UniqueName: \"kubernetes.io/projected/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-kube-api-access-h49mb\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431139 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431147 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.431157 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.438469 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data" (OuterVolumeSpecName: "config-data") pod "890ee3bb-c943-4b52-9a6e-d97ab4a5d969" (UID: "890ee3bb-c943-4b52-9a6e-d97ab4a5d969"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487841 4744 generic.go:334] "Generic (PLEG): container finished" podID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" exitCode=0 Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487882 4744 generic.go:334] "Generic (PLEG): container finished" podID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" exitCode=2 Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487896 4744 generic.go:334] "Generic (PLEG): container finished" podID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" exitCode=0 Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487910 4744 generic.go:334] "Generic (PLEG): container finished" podID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" exitCode=0 Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487939 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerDied","Data":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487975 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerDied","Data":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.487996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerDied","Data":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.488012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerDied","Data":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.488030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"890ee3bb-c943-4b52-9a6e-d97ab4a5d969","Type":"ContainerDied","Data":"8be208a5f22bd304f99e2de8d3c242f7758e42531faeaa1cb3c66265289127b3"} Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.488053 4744 scope.go:117] "RemoveContainer" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.488257 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.512633 4744 scope.go:117] "RemoveContainer" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.533312 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890ee3bb-c943-4b52-9a6e-d97ab4a5d969-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.544200 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.544250 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.553810 4744 scope.go:117] "RemoveContainer" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569341 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569703 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-kuttl-api-log" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569722 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-kuttl-api-log" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569736 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-central-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569743 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-central-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569758 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="proxy-httpd" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569764 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="proxy-httpd" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569786 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-notification-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569792 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-notification-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569804 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="sg-core" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569811 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="sg-core" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569819 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" containerName="watcher-applier" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569824 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" 
containerName="watcher-applier" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569830 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569835 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.569842 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87e619c-7589-4c51-89f4-16225dfa63db" containerName="mariadb-account-delete" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569848 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87e619c-7589-4c51-89f4-16225dfa63db" containerName="mariadb-account-delete" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569980 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="proxy-httpd" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.569997 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-api" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570005 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-notification-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570016 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87e619c-7589-4c51-89f4-16225dfa63db" containerName="mariadb-account-delete" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570025 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfab7c4f-2ead-4d5c-9030-470b94cbf936" containerName="watcher-kuttl-api-log" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570034 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="sg-core" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570042 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd44a318-1763-4262-98ba-76c75ca8154b" containerName="watcher-applier" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.570050 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" containerName="ceilometer-central-agent" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.571505 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.577872 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.578105 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.578255 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.582413 4744 scope.go:117] "RemoveContainer" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.583504 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.620127 4744 scope.go:117] "RemoveContainer" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.620654 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": container with ID starting with fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08 not found: ID does not exist" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.620698 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} err="failed to get container status \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": rpc error: code = NotFound desc = could not find container \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": container with ID starting with fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.620730 4744 scope.go:117] "RemoveContainer" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.621056 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": container with ID starting with f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0 not found: ID does not exist" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.621079 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} err="failed to get container status \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": rpc error: code = NotFound desc = could not find container \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": container with ID starting with f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.621094 4744 scope.go:117] "RemoveContainer" 
containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.621475 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": container with ID starting with c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a not found: ID does not exist" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.621511 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} err="failed to get container status \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": rpc error: code = NotFound desc = could not find container \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": container with ID starting with c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.621541 4744 scope.go:117] "RemoveContainer" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: E1205 20:35:47.622267 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": container with ID starting with e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29 not found: ID does not exist" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.622319 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} err="failed to get container status \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": rpc error: code = NotFound desc = could not find container \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": container with ID starting with e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.622342 4744 scope.go:117] "RemoveContainer" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.623092 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} err="failed to get container status \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": rpc error: code = NotFound desc = could not find container \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": container with ID starting with fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.623657 4744 scope.go:117] "RemoveContainer" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.625695 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} err="failed to get container status \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": rpc error: code = NotFound desc = could not find container \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": container with ID starting with f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.625731 4744 scope.go:117] "RemoveContainer" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.625909 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} err="failed to get container status \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": rpc error: code = NotFound desc = could not find container \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": container with ID starting with c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.625930 4744 scope.go:117] "RemoveContainer" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.626182 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} err="failed to get container status \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": rpc error: code = NotFound desc = could not find container \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": container with ID starting with e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.626224 4744 scope.go:117] "RemoveContainer" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.626552 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} err="failed to get container status \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": rpc error: code = NotFound desc = could not find container \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": container with ID starting with fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.626581 4744 scope.go:117] "RemoveContainer" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627321 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} err="failed to get container status \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": rpc error: code = NotFound desc = could not find container \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": container with ID starting with f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0 not found: ID does not exist" Dec 
05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627363 4744 scope.go:117] "RemoveContainer" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627591 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} err="failed to get container status \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": rpc error: code = NotFound desc = could not find container \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": container with ID starting with c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627630 4744 scope.go:117] "RemoveContainer" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627859 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} err="failed to get container status \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": rpc error: code = NotFound desc = could not find container \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": container with ID starting with e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.627879 4744 scope.go:117] "RemoveContainer" containerID="fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.628264 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08"} err="failed to get container status \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": rpc error: code = NotFound desc = could not find container \"fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08\": container with ID starting with fca23a4ed3d4caf61ddbf634dcece50c2e9aaac8c73365feb5c2e00409442e08 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.628302 4744 scope.go:117] "RemoveContainer" containerID="f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.628509 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0"} err="failed to get container status \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": rpc error: code = NotFound desc = could not find container \"f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0\": container with ID starting with f972c11ca40f92224f995a086878a56d3d3942b050533c07dd6e7f5c9a7a12c0 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.628527 4744 scope.go:117] "RemoveContainer" containerID="c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.629015 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a"} err="failed to get container status 
\"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": rpc error: code = NotFound desc = could not find container \"c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a\": container with ID starting with c97104d008830746c14ca92d88fbba4f89273c472eacb976e72281efabc2183a not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.629046 4744 scope.go:117] "RemoveContainer" containerID="e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.629235 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29"} err="failed to get container status \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": rpc error: code = NotFound desc = could not find container \"e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29\": container with ID starting with e7daa3312cd8d06ff24bc7ad550fc57c3807654b1dbd52ea09fc64292bf8ba29 not found: ID does not exist" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634228 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634303 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8s5m\" (UniqueName: \"kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634330 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634398 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634505 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.634589 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735628 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735651 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735684 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8s5m\" (UniqueName: \"kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.735801 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.736227 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.739321 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.739608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.741926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.744222 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.744221 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.745135 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.761903 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8s5m\" (UniqueName: \"kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m\") pod \"ceilometer-0\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:47 crc kubenswrapper[4744]: I1205 20:35:47.890215 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:48 crc kubenswrapper[4744]: I1205 20:35:48.096052 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890ee3bb-c943-4b52-9a6e-d97ab4a5d969" path="/var/lib/kubelet/pods/890ee3bb-c943-4b52-9a6e-d97ab4a5d969/volumes" Dec 05 20:35:48 crc kubenswrapper[4744]: W1205 20:35:48.390597 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b283d3_4309_4156_a59f_3031bd597f19.slice/crio-213d4aff9bf96285b40bd4436ed0ef97eee879d4f89aca49498c1453c771bf24 WatchSource:0}: Error finding container 213d4aff9bf96285b40bd4436ed0ef97eee879d4f89aca49498c1453c771bf24: Status 404 returned error can't find the container with id 213d4aff9bf96285b40bd4436ed0ef97eee879d4f89aca49498c1453c771bf24 Dec 05 20:35:48 crc kubenswrapper[4744]: I1205 20:35:48.403946 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:35:48 crc kubenswrapper[4744]: I1205 20:35:48.497885 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerStarted","Data":"213d4aff9bf96285b40bd4436ed0ef97eee879d4f89aca49498c1453c771bf24"} Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.070162 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5vtxf"] Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.072055 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.082158 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vtxf"] Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.159422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-utilities\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.160428 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm965\" (UniqueName: \"kubernetes.io/projected/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-kube-api-access-jm965\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.161347 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-catalog-content\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.262899 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm965\" (UniqueName: \"kubernetes.io/projected/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-kube-api-access-jm965\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc 
kubenswrapper[4744]: I1205 20:35:49.262944 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-catalog-content\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.263025 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-utilities\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.263522 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-utilities\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.263740 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-catalog-content\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.289150 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm965\" (UniqueName: \"kubernetes.io/projected/9d2d3bdb-3fb4-4934-a6a6-5943e734a347-kube-api-access-jm965\") pod \"certified-operators-5vtxf\" (UID: \"9d2d3bdb-3fb4-4934-a6a6-5943e734a347\") " pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.445964 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:35:49 crc kubenswrapper[4744]: I1205 20:35:49.509124 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerStarted","Data":"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.010074 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs\") pod \"545e6f9e-ddf0-43e6-b712-897b54463135\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076456 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7mc8\" (UniqueName: \"kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8\") pod \"545e6f9e-ddf0-43e6-b712-897b54463135\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076487 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data\") pod \"545e6f9e-ddf0-43e6-b712-897b54463135\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076549 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle\") pod \"545e6f9e-ddf0-43e6-b712-897b54463135\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076658 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca\") pod \"545e6f9e-ddf0-43e6-b712-897b54463135\" (UID: \"545e6f9e-ddf0-43e6-b712-897b54463135\") " Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.076691 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs" (OuterVolumeSpecName: "logs") pod "545e6f9e-ddf0-43e6-b712-897b54463135" (UID: "545e6f9e-ddf0-43e6-b712-897b54463135"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.077020 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/545e6f9e-ddf0-43e6-b712-897b54463135-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.084449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8" (OuterVolumeSpecName: "kube-api-access-s7mc8") pod "545e6f9e-ddf0-43e6-b712-897b54463135" (UID: "545e6f9e-ddf0-43e6-b712-897b54463135"). InnerVolumeSpecName "kube-api-access-s7mc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.099653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "545e6f9e-ddf0-43e6-b712-897b54463135" (UID: "545e6f9e-ddf0-43e6-b712-897b54463135"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.117039 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "545e6f9e-ddf0-43e6-b712-897b54463135" (UID: "545e6f9e-ddf0-43e6-b712-897b54463135"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.133378 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data" (OuterVolumeSpecName: "config-data") pod "545e6f9e-ddf0-43e6-b712-897b54463135" (UID: "545e6f9e-ddf0-43e6-b712-897b54463135"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.180373 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.180414 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.180426 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7mc8\" (UniqueName: \"kubernetes.io/projected/545e6f9e-ddf0-43e6-b712-897b54463135-kube-api-access-s7mc8\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.180459 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e6f9e-ddf0-43e6-b712-897b54463135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.192796 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vtxf"] Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.522853 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d2d3bdb-3fb4-4934-a6a6-5943e734a347" containerID="849c56a386e79284c3ed57d016608e6a1713eda83ef7e82ca09401a4d4ca9349" exitCode=0 Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.524410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vtxf" event={"ID":"9d2d3bdb-3fb4-4934-a6a6-5943e734a347","Type":"ContainerDied","Data":"849c56a386e79284c3ed57d016608e6a1713eda83ef7e82ca09401a4d4ca9349"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.524440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vtxf" event={"ID":"9d2d3bdb-3fb4-4934-a6a6-5943e734a347","Type":"ContainerStarted","Data":"5f7bdba99d01d214d089da89a83857199b3ce15eba7b4f9f6ebfa3cd26851224"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.527146 4744 generic.go:334] "Generic (PLEG): container finished" podID="545e6f9e-ddf0-43e6-b712-897b54463135" containerID="a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f" exitCode=0 Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.527220 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"545e6f9e-ddf0-43e6-b712-897b54463135","Type":"ContainerDied","Data":"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.527252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"545e6f9e-ddf0-43e6-b712-897b54463135","Type":"ContainerDied","Data":"5a1a219dea507276d6c1e64bb0ff912b906791623405ea21cba5c12ec4f72587"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.527271 4744 scope.go:117] "RemoveContainer" containerID="a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.527439 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.534319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerStarted","Data":"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.534362 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerStarted","Data":"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556"} Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.576473 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.583256 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.584379 4744 scope.go:117] "RemoveContainer" containerID="a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f" Dec 05 20:35:50 crc kubenswrapper[4744]: E1205 20:35:50.584688 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f\": container with ID starting with a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f not found: ID does not exist" containerID="a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f" Dec 05 20:35:50 crc kubenswrapper[4744]: I1205 20:35:50.584715 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f"} err="failed to get container status \"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f\": rpc error: code = NotFound desc = could not find container \"a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f\": container with ID starting with a4688663e4fc5efe4bcd8bb454f4e6c054692cbbb8a506ebaf2deb646456aa4f not found: ID does not exist" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.635247 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m"] Dec 05 20:35:51 crc kubenswrapper[4744]: E1205 20:35:51.639419 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545e6f9e-ddf0-43e6-b712-897b54463135" containerName="watcher-decision-engine" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.639447 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="545e6f9e-ddf0-43e6-b712-897b54463135" containerName="watcher-decision-engine" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.640095 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="545e6f9e-ddf0-43e6-b712-897b54463135" containerName="watcher-decision-engine" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.641935 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.645654 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.652135 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-cpjvn"] Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.654437 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.684301 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cpjvn"] Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.698537 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m"] Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.719897 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.720116 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqbc\" (UniqueName: \"kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.720187 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.720234 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqnr\" (UniqueName: \"kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.821673 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " 
pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.821775 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqnr\" (UniqueName: \"kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.821957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.822007 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqbc\" (UniqueName: \"kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.822702 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.822879 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.838411 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqbc\" (UniqueName: \"kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc\") pod \"watcher-7cee-account-create-update-hdb9m\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:51 crc kubenswrapper[4744]: I1205 20:35:51.848518 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqnr\" (UniqueName: \"kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr\") pod \"watcher-db-create-cpjvn\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.042338 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.052093 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.091963 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545e6f9e-ddf0-43e6-b712-897b54463135" path="/var/lib/kubelet/pods/545e6f9e-ddf0-43e6-b712-897b54463135/volumes" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.554889 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cpjvn"] Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.572267 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerStarted","Data":"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567"} Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.572663 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.611667 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.499822993 podStartE2EDuration="5.611638202s" podCreationTimestamp="2025-12-05 20:35:47 +0000 UTC" firstStartedPulling="2025-12-05 20:35:48.39318592 +0000 UTC m=+1518.622997328" lastFinishedPulling="2025-12-05 20:35:51.505001169 +0000 UTC m=+1521.734812537" observedRunningTime="2025-12-05 20:35:52.60012915 +0000 UTC m=+1522.829940518" watchObservedRunningTime="2025-12-05 20:35:52.611638202 +0000 UTC m=+1522.841449690" Dec 05 20:35:52 crc kubenswrapper[4744]: I1205 20:35:52.659490 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m"] Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.582063 4744 generic.go:334] "Generic (PLEG): container finished" podID="0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" containerID="ed8f9472f629c57d540ad8173288862985f4214695222941bf27f9f2da8817c9" exitCode=0 Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.582373 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cpjvn" event={"ID":"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda","Type":"ContainerDied","Data":"ed8f9472f629c57d540ad8173288862985f4214695222941bf27f9f2da8817c9"} Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.582398 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cpjvn" event={"ID":"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda","Type":"ContainerStarted","Data":"2c0d57e9a47adac3663d8ab8dfa0f8cc1b80d9663f2f37dabad09273788a7120"} Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.584495 4744 generic.go:334] "Generic (PLEG): container finished" podID="9629e785-9003-4e51-9e0d-3081a14b6003" containerID="d3dc5d164c98cce0dd3794f22e4f0b9a0f892af71335254d651126c1503f99c3" exitCode=0 Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.585805 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" event={"ID":"9629e785-9003-4e51-9e0d-3081a14b6003","Type":"ContainerDied","Data":"d3dc5d164c98cce0dd3794f22e4f0b9a0f892af71335254d651126c1503f99c3"} Dec 05 20:35:53 crc kubenswrapper[4744]: I1205 20:35:53.585844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" 
event={"ID":"9629e785-9003-4e51-9e0d-3081a14b6003","Type":"ContainerStarted","Data":"c192c5770b2b7c44ae2490c283429ce7f013cf387b7378e9c11cfe7fda97ba0d"} Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.252590 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.275913 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.375031 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts\") pod \"9629e785-9003-4e51-9e0d-3081a14b6003\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.375362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbqbc\" (UniqueName: \"kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc\") pod \"9629e785-9003-4e51-9e0d-3081a14b6003\" (UID: \"9629e785-9003-4e51-9e0d-3081a14b6003\") " Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.376197 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9629e785-9003-4e51-9e0d-3081a14b6003" (UID: "9629e785-9003-4e51-9e0d-3081a14b6003"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.387364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc" (OuterVolumeSpecName: "kube-api-access-qbqbc") pod "9629e785-9003-4e51-9e0d-3081a14b6003" (UID: "9629e785-9003-4e51-9e0d-3081a14b6003"). InnerVolumeSpecName "kube-api-access-qbqbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.477243 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts\") pod \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.477552 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqnr\" (UniqueName: \"kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr\") pod \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\" (UID: \"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda\") " Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.477927 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9629e785-9003-4e51-9e0d-3081a14b6003-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.478021 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbqbc\" (UniqueName: \"kubernetes.io/projected/9629e785-9003-4e51-9e0d-3081a14b6003-kube-api-access-qbqbc\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.478458 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" (UID: "0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.481417 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr" (OuterVolumeSpecName: "kube-api-access-zvqnr") pod "0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" (UID: "0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda"). InnerVolumeSpecName "kube-api-access-zvqnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.579505 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqnr\" (UniqueName: \"kubernetes.io/projected/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-kube-api-access-zvqnr\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.579540 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.633673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cpjvn" event={"ID":"0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda","Type":"ContainerDied","Data":"2c0d57e9a47adac3663d8ab8dfa0f8cc1b80d9663f2f37dabad09273788a7120"} Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.633701 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cpjvn" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.633706 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c0d57e9a47adac3663d8ab8dfa0f8cc1b80d9663f2f37dabad09273788a7120" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.637156 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.637169 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m" event={"ID":"9629e785-9003-4e51-9e0d-3081a14b6003","Type":"ContainerDied","Data":"c192c5770b2b7c44ae2490c283429ce7f013cf387b7378e9c11cfe7fda97ba0d"} Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.637227 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c192c5770b2b7c44ae2490c283429ce7f013cf387b7378e9c11cfe7fda97ba0d" Dec 05 20:35:57 crc kubenswrapper[4744]: I1205 20:35:57.639607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vtxf" event={"ID":"9d2d3bdb-3fb4-4934-a6a6-5943e734a347","Type":"ContainerStarted","Data":"8b7836c67443cfeba315a57f01b3a643edbde8ec7c86692ff35c33ed3b11af49"} Dec 05 20:35:58 crc kubenswrapper[4744]: I1205 20:35:58.652407 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d2d3bdb-3fb4-4934-a6a6-5943e734a347" containerID="8b7836c67443cfeba315a57f01b3a643edbde8ec7c86692ff35c33ed3b11af49" exitCode=0 Dec 05 20:35:58 crc kubenswrapper[4744]: I1205 20:35:58.652795 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vtxf" event={"ID":"9d2d3bdb-3fb4-4934-a6a6-5943e734a347","Type":"ContainerDied","Data":"8b7836c67443cfeba315a57f01b3a643edbde8ec7c86692ff35c33ed3b11af49"} Dec 05 20:36:00 crc kubenswrapper[4744]: I1205 20:36:00.671245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vtxf" event={"ID":"9d2d3bdb-3fb4-4934-a6a6-5943e734a347","Type":"ContainerStarted","Data":"e01d9a2541f81313199c29febbe59587b6d6f2db88ea73efa1bc35800e34b58f"} Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.701262 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5vtxf" podStartSLOduration=3.194873494 podStartE2EDuration="12.701238086s" podCreationTimestamp="2025-12-05 20:35:49 +0000 UTC" firstStartedPulling="2025-12-05 20:35:50.525324524 +0000 UTC m=+1520.755135902" lastFinishedPulling="2025-12-05 20:36:00.031689126 +0000 UTC m=+1530.261500494" observedRunningTime="2025-12-05 20:36:01.696585671 +0000 UTC m=+1531.926397049" watchObservedRunningTime="2025-12-05 20:36:01.701238086 +0000 UTC m=+1531.931049454" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.738769 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-md64s"] Dec 05 20:36:01 crc kubenswrapper[4744]: E1205 20:36:01.739143 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9629e785-9003-4e51-9e0d-3081a14b6003" containerName="mariadb-account-create-update" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.739164 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9629e785-9003-4e51-9e0d-3081a14b6003" containerName="mariadb-account-create-update" Dec 05 20:36:01 crc 
kubenswrapper[4744]: E1205 20:36:01.739210 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" containerName="mariadb-database-create" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.739219 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" containerName="mariadb-database-create" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.739440 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9629e785-9003-4e51-9e0d-3081a14b6003" containerName="mariadb-account-create-update" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.739471 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" containerName="mariadb-database-create" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.740112 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.746391 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.746499 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-xwm6l" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.749624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-md64s"] Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.902030 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.902145 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.902172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:01 crc kubenswrapper[4744]: I1205 20:36:01.902424 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l6wc\" (UniqueName: \"kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.003989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l6wc\" (UniqueName: \"kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc\") pod 
\"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.004116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.004201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.004223 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.010051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.010110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.027589 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.030053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l6wc\" (UniqueName: \"kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc\") pod \"watcher-kuttl-db-sync-md64s\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.106973 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.578429 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-md64s"] Dec 05 20:36:02 crc kubenswrapper[4744]: I1205 20:36:02.694134 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" event={"ID":"9c90c470-2c0c-42bb-8aaa-2716399201bf","Type":"ContainerStarted","Data":"abc70906a975553c68a76e962f80f7b0af45197666ef89c7259ee7e494c114ea"} Dec 05 20:36:03 crc kubenswrapper[4744]: I1205 20:36:03.704644 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" event={"ID":"9c90c470-2c0c-42bb-8aaa-2716399201bf","Type":"ContainerStarted","Data":"d839f00b5da1fdd7f5c48fcbcc62959786f1353040ce510d3f6bd821e83b0e9a"} Dec 05 20:36:03 crc kubenswrapper[4744]: I1205 20:36:03.731051 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" podStartSLOduration=2.7310296960000002 podStartE2EDuration="2.731029696s" podCreationTimestamp="2025-12-05 20:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:03.718506499 +0000 UTC m=+1533.948317887" watchObservedRunningTime="2025-12-05 20:36:03.731029696 +0000 UTC m=+1533.960841084" Dec 05 20:36:05 crc kubenswrapper[4744]: I1205 20:36:05.732131 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c90c470-2c0c-42bb-8aaa-2716399201bf" containerID="d839f00b5da1fdd7f5c48fcbcc62959786f1353040ce510d3f6bd821e83b0e9a" exitCode=0 Dec 05 20:36:05 crc kubenswrapper[4744]: I1205 20:36:05.732237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" event={"ID":"9c90c470-2c0c-42bb-8aaa-2716399201bf","Type":"ContainerDied","Data":"d839f00b5da1fdd7f5c48fcbcc62959786f1353040ce510d3f6bd821e83b0e9a"} Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.184718 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.286497 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l6wc\" (UniqueName: \"kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc\") pod \"9c90c470-2c0c-42bb-8aaa-2716399201bf\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.286587 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data\") pod \"9c90c470-2c0c-42bb-8aaa-2716399201bf\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.286673 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data\") pod \"9c90c470-2c0c-42bb-8aaa-2716399201bf\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.286725 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle\") pod \"9c90c470-2c0c-42bb-8aaa-2716399201bf\" (UID: \"9c90c470-2c0c-42bb-8aaa-2716399201bf\") " Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.292202 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c90c470-2c0c-42bb-8aaa-2716399201bf" (UID: "9c90c470-2c0c-42bb-8aaa-2716399201bf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.293487 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc" (OuterVolumeSpecName: "kube-api-access-6l6wc") pod "9c90c470-2c0c-42bb-8aaa-2716399201bf" (UID: "9c90c470-2c0c-42bb-8aaa-2716399201bf"). InnerVolumeSpecName "kube-api-access-6l6wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.326062 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data" (OuterVolumeSpecName: "config-data") pod "9c90c470-2c0c-42bb-8aaa-2716399201bf" (UID: "9c90c470-2c0c-42bb-8aaa-2716399201bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.329616 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c90c470-2c0c-42bb-8aaa-2716399201bf" (UID: "9c90c470-2c0c-42bb-8aaa-2716399201bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.391496 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.391527 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.391543 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c90c470-2c0c-42bb-8aaa-2716399201bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.391555 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l6wc\" (UniqueName: \"kubernetes.io/projected/9c90c470-2c0c-42bb-8aaa-2716399201bf-kube-api-access-6l6wc\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.783654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" event={"ID":"9c90c470-2c0c-42bb-8aaa-2716399201bf","Type":"ContainerDied","Data":"abc70906a975553c68a76e962f80f7b0af45197666ef89c7259ee7e494c114ea"} Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.783991 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc70906a975553c68a76e962f80f7b0af45197666ef89c7259ee7e494c114ea" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.783751 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-md64s" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.945936 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:07 crc kubenswrapper[4744]: E1205 20:36:07.946314 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c90c470-2c0c-42bb-8aaa-2716399201bf" containerName="watcher-kuttl-db-sync" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.946333 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c90c470-2c0c-42bb-8aaa-2716399201bf" containerName="watcher-kuttl-db-sync" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.946510 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c90c470-2c0c-42bb-8aaa-2716399201bf" containerName="watcher-kuttl-db-sync" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.948522 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.951430 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.951645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-xwm6l" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.959416 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.960618 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.966211 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.973680 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:07 crc kubenswrapper[4744]: I1205 20:36:07.980726 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.092736 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.094203 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.097346 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.097635 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.097669 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.103165 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105225 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105267 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105310 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105349 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf94v\" (UniqueName: \"kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jxn\" (UniqueName: 
\"kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105425 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105444 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.105537 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.207587 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v98z\" (UniqueName: \"kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 
20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208701 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208809 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208901 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.208986 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209178 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209281 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209428 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 
20:36:08.209528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209551 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209659 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf94v\" (UniqueName: \"kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jxn\" (UniqueName: \"kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.209787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.212919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.213086 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.213718 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.214053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc 
kubenswrapper[4744]: I1205 20:36:08.221787 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.228532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jxn\" (UniqueName: \"kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn\") pod \"watcher-kuttl-applier-0\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.229923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf94v\" (UniqueName: \"kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.265155 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.279576 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.311379 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.312603 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.313305 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.313812 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.314009 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.314128 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.314347 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v98z\" (UniqueName: \"kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.312539 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.318609 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.320392 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.321016 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.320395 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.322105 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.343222 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v98z\" (UniqueName: \"kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z\") pod \"watcher-kuttl-api-0\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.412062 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.766067 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:08 crc kubenswrapper[4744]: W1205 20:36:08.770450 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6709afd_afa1_4ae1_bf43_bdf93b5bce55.slice/crio-b82972244f0ebb43c649c5488ea33409345587d1cde499eb8dfa038d8caf8fda WatchSource:0}: Error finding container b82972244f0ebb43c649c5488ea33409345587d1cde499eb8dfa038d8caf8fda: Status 404 returned error can't find the container with id b82972244f0ebb43c649c5488ea33409345587d1cde499eb8dfa038d8caf8fda Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.796555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a6709afd-afa1-4ae1-bf43-bdf93b5bce55","Type":"ContainerStarted","Data":"b82972244f0ebb43c649c5488ea33409345587d1cde499eb8dfa038d8caf8fda"} Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.838596 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:08 crc kubenswrapper[4744]: I1205 20:36:08.943342 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.446529 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.446871 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.497781 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.818683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerStarted","Data":"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d"} Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.818738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerStarted","Data":"55c48e1ec81606c484286d3c3a765b5f01b6b43847a6bde1f79d3b3c81cb240b"} Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.850470 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a6709afd-afa1-4ae1-bf43-bdf93b5bce55","Type":"ContainerStarted","Data":"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f"} Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.878788 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.878765172 podStartE2EDuration="2.878765172s" podCreationTimestamp="2025-12-05 20:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:09.876590329 +0000 UTC m=+1540.106401697" watchObservedRunningTime="2025-12-05 20:36:09.878765172 
+0000 UTC m=+1540.108576540" Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.892558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9fddddf3-b4be-4c28-a92f-87c7359418bc","Type":"ContainerStarted","Data":"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f"} Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.892596 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9fddddf3-b4be-4c28-a92f-87c7359418bc","Type":"ContainerStarted","Data":"d1443b08c519ebb097f610b1f8de7848c31307170e82fd78a820144c9d447430"} Dec 05 20:36:09 crc kubenswrapper[4744]: I1205 20:36:09.920115 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.920101777 podStartE2EDuration="2.920101777s" podCreationTimestamp="2025-12-05 20:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:09.913607137 +0000 UTC m=+1540.143418505" watchObservedRunningTime="2025-12-05 20:36:09.920101777 +0000 UTC m=+1540.149913145" Dec 05 20:36:10 crc kubenswrapper[4744]: I1205 20:36:10.000333 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5vtxf" Dec 05 20:36:10 crc kubenswrapper[4744]: I1205 20:36:10.944913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerStarted","Data":"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee"} Dec 05 20:36:10 crc kubenswrapper[4744]: I1205 20:36:10.945864 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:10 crc kubenswrapper[4744]: I1205 20:36:10.976703 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.97668045 podStartE2EDuration="2.97668045s" podCreationTimestamp="2025-12-05 20:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:10.965961957 +0000 UTC m=+1541.195773315" watchObservedRunningTime="2025-12-05 20:36:10.97668045 +0000 UTC m=+1541.206491818" Dec 05 20:36:12 crc kubenswrapper[4744]: I1205 20:36:12.502936 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vtxf"] Dec 05 20:36:12 crc kubenswrapper[4744]: I1205 20:36:12.965042 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.059701 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"] Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.059947 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4qjk5" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="registry-server" containerID="cri-o://309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9" gracePeriod=2 Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.163839 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.279762 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.413486 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.550422 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qjk5"
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.614199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content\") pod \"082710f4-5dbe-49a3-a13a-1cc99036f530\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") "
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.614255 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6bkk\" (UniqueName: \"kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk\") pod \"082710f4-5dbe-49a3-a13a-1cc99036f530\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") "
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.614319 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities\") pod \"082710f4-5dbe-49a3-a13a-1cc99036f530\" (UID: \"082710f4-5dbe-49a3-a13a-1cc99036f530\") "
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.614867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities" (OuterVolumeSpecName: "utilities") pod "082710f4-5dbe-49a3-a13a-1cc99036f530" (UID: "082710f4-5dbe-49a3-a13a-1cc99036f530"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.630476 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk" (OuterVolumeSpecName: "kube-api-access-z6bkk") pod "082710f4-5dbe-49a3-a13a-1cc99036f530" (UID: "082710f4-5dbe-49a3-a13a-1cc99036f530"). InnerVolumeSpecName "kube-api-access-z6bkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.676104 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082710f4-5dbe-49a3-a13a-1cc99036f530" (UID: "082710f4-5dbe-49a3-a13a-1cc99036f530"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.716673 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.716719 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6bkk\" (UniqueName: \"kubernetes.io/projected/082710f4-5dbe-49a3-a13a-1cc99036f530-kube-api-access-z6bkk\") on node \"crc\" DevicePath \"\""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.716734 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082710f4-5dbe-49a3-a13a-1cc99036f530-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.975060 4744 generic.go:334] "Generic (PLEG): container finished" podID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerID="309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9" exitCode=0
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.975143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerDied","Data":"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"}
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.975158 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qjk5"
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.975192 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qjk5" event={"ID":"082710f4-5dbe-49a3-a13a-1cc99036f530","Type":"ContainerDied","Data":"a008fd53811f986e7fdced29ab3bc47d1801b04e9571475b0c6c9fd2a794ccd4"}
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.975215 4744 scope.go:117] "RemoveContainer" containerID="309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"
Dec 05 20:36:13 crc kubenswrapper[4744]: I1205 20:36:13.998920 4744 scope.go:117] "RemoveContainer" containerID="a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.011383 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"]
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.025669 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4qjk5"]
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.029751 4744 scope.go:117] "RemoveContainer" containerID="52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.058080 4744 scope.go:117] "RemoveContainer" containerID="309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"
Dec 05 20:36:14 crc kubenswrapper[4744]: E1205 20:36:14.058576 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9\": container with ID starting with 309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9 not found: ID does not exist" containerID="309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.058606 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9"} err="failed to get container status \"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9\": rpc error: code = NotFound desc = could not find container \"309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9\": container with ID starting with 309e7350dcb2714350f75e77034b6657ef6172335cabfdaef4787603078ba4a9 not found: ID does not exist"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.058629 4744 scope.go:117] "RemoveContainer" containerID="a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716"
Dec 05 20:36:14 crc kubenswrapper[4744]: E1205 20:36:14.058945 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716\": container with ID starting with a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716 not found: ID does not exist" containerID="a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.059034 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716"} err="failed to get container status \"a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716\": rpc error: code = NotFound desc = could not find container \"a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716\": container with ID starting with a08cfa27ca8e3b8aa019e0779774760fec6d8aa86e8900b12de981b64814b716 not found: ID does not exist"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.059068 4744 scope.go:117] "RemoveContainer" containerID="52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf"
Dec 05 20:36:14 crc kubenswrapper[4744]: E1205 20:36:14.059399 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf\": container with ID starting with 52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf not found: ID does not exist" containerID="52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.059432 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf"} err="failed to get container status \"52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf\": rpc error: code = NotFound desc = could not find container \"52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf\": container with ID starting with 52645b7018014bfa5160b33867863d514fbe5296704413d262c6a2ca400f6fdf not found: ID does not exist"
Dec 05 20:36:14 crc kubenswrapper[4744]: I1205 20:36:14.090825 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" path="/var/lib/kubelet/pods/082710f4-5dbe-49a3-a13a-1cc99036f530/volumes"
Dec 05 20:36:17 crc kubenswrapper[4744]: I1205 20:36:17.923182 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.266367 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.280456 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.298790 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.308034 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.413512 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:36:18 crc kubenswrapper[4744]: I1205 20:36:18.420706 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.020076 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.038993 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.051867 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.058507 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.273831 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"]
Dec 05 20:36:19 crc kubenswrapper[4744]: E1205 20:36:19.274236 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="extract-content"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.274260 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="extract-content"
Dec 05 20:36:19 crc kubenswrapper[4744]: E1205 20:36:19.274275 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="extract-utilities"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.274289 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="extract-utilities"
Dec 05 20:36:19 crc kubenswrapper[4744]: E1205 20:36:19.274323 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="registry-server"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.274332 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="registry-server"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.274557 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="082710f4-5dbe-49a3-a13a-1cc99036f530" containerName="registry-server"
Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.276446 4744 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.285904 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"] Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.316211 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.316280 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdhw\" (UniqueName: \"kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.316338 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.417996 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdhw\" (UniqueName: \"kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.418063 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.418146 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.418555 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.418849 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.436615 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-njdhw\" (UniqueName: \"kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw\") pod \"redhat-marketplace-8mdr9\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:19 crc kubenswrapper[4744]: I1205 20:36:19.595329 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:20 crc kubenswrapper[4744]: I1205 20:36:20.136417 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"] Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.064603 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c7575ce-0998-494d-9273-8c7624057354" containerID="7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256" exitCode=0 Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.064693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerDied","Data":"7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256"} Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.064984 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerStarted","Data":"031f1a04a516ad2be2feb9460290e1ccad528ea8abb00bf609045efe66f4f92d"} Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.909963 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.910801 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="proxy-httpd" containerID="cri-o://a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567" gracePeriod=30 Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.910807 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="sg-core" containerID="cri-o://ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de" gracePeriod=30 Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.910840 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-notification-agent" containerID="cri-o://b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556" gracePeriod=30 Dec 05 20:36:21 crc kubenswrapper[4744]: I1205 20:36:21.910983 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-central-agent" containerID="cri-o://6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d" gracePeriod=30 Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.075221 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c7575ce-0998-494d-9273-8c7624057354" containerID="d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed" exitCode=0 Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.075280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerDied","Data":"d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed"} Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.081318 4744 generic.go:334] "Generic (PLEG): container finished" podID="33b283d3-4309-4156-a59f-3031bd597f19" containerID="a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567" exitCode=0 Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.081340 4744 generic.go:334] "Generic (PLEG): container finished" podID="33b283d3-4309-4156-a59f-3031bd597f19" containerID="ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de" exitCode=2 Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.091737 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerDied","Data":"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567"} Dec 05 20:36:22 crc kubenswrapper[4744]: I1205 20:36:22.091802 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerDied","Data":"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de"} Dec 05 20:36:23 crc kubenswrapper[4744]: I1205 20:36:23.093180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerStarted","Data":"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577"} Dec 05 20:36:23 crc kubenswrapper[4744]: I1205 20:36:23.096822 4744 generic.go:334] "Generic (PLEG): container finished" podID="33b283d3-4309-4156-a59f-3031bd597f19" containerID="6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d" exitCode=0 Dec 05 20:36:23 crc kubenswrapper[4744]: I1205 20:36:23.096890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerDied","Data":"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d"} Dec 05 20:36:23 crc kubenswrapper[4744]: I1205 20:36:23.117415 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mdr9" podStartSLOduration=2.75001566 podStartE2EDuration="4.117400712s" podCreationTimestamp="2025-12-05 20:36:19 +0000 UTC" firstStartedPulling="2025-12-05 20:36:21.065991421 +0000 UTC m=+1551.295802789" lastFinishedPulling="2025-12-05 20:36:22.433376473 +0000 UTC m=+1552.663187841" observedRunningTime="2025-12-05 20:36:23.113518018 +0000 UTC m=+1553.343329386" watchObservedRunningTime="2025-12-05 20:36:23.117400712 +0000 UTC m=+1553.347212070" Dec 05 20:36:26 crc kubenswrapper[4744]: I1205 20:36:26.976311 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061181 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061553 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061594 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061713 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061758 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8s5m\" (UniqueName: \"kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061810 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.061866 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd\") pod \"33b283d3-4309-4156-a59f-3031bd597f19\" (UID: \"33b283d3-4309-4156-a59f-3031bd597f19\") " Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.062136 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.062326 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.062531 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.070781 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m" (OuterVolumeSpecName: "kube-api-access-t8s5m") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "kube-api-access-t8s5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.095853 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts" (OuterVolumeSpecName: "scripts") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.147775 4744 generic.go:334] "Generic (PLEG): container finished" podID="33b283d3-4309-4156-a59f-3031bd597f19" containerID="b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556" exitCode=0 Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.147850 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.147845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerDied","Data":"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556"} Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.148000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"33b283d3-4309-4156-a59f-3031bd597f19","Type":"ContainerDied","Data":"213d4aff9bf96285b40bd4436ed0ef97eee879d4f89aca49498c1453c771bf24"} Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.148054 4744 scope.go:117] "RemoveContainer" containerID="a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.150448 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.157134 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.164717 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.164743 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8s5m\" (UniqueName: \"kubernetes.io/projected/33b283d3-4309-4156-a59f-3031bd597f19-kube-api-access-t8s5m\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.164753 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33b283d3-4309-4156-a59f-3031bd597f19-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.164762 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.164790 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.185240 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.185419 4744 scope.go:117] "RemoveContainer" containerID="ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.204563 4744 scope.go:117] "RemoveContainer" containerID="b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.215814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data" (OuterVolumeSpecName: "config-data") pod "33b283d3-4309-4156-a59f-3031bd597f19" (UID: "33b283d3-4309-4156-a59f-3031bd597f19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.234215 4744 scope.go:117] "RemoveContainer" containerID="6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.251895 4744 scope.go:117] "RemoveContainer" containerID="a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.252313 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567\": container with ID starting with a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567 not found: ID does not exist" containerID="a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.252364 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567"} err="failed to get container status \"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567\": rpc error: code = NotFound desc = could not find container \"a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567\": container with ID starting with a6ff0fd7d97ea92a4172567ab5282c8cc540812d0363b224879e551b432ef567 not found: ID does not exist" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.252392 4744 scope.go:117] "RemoveContainer" containerID="ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.252874 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de\": container with ID starting with ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de not found: ID does not exist" containerID="ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.252892 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de"} err="failed to get container status \"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de\": rpc error: code = NotFound desc = could not find container \"ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de\": container with ID starting with ff7e109aeb6cb3999df7d8fd0852d643ae7e4f972e917feadc2f51b0995506de not found: ID does not exist" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.252907 4744 scope.go:117] "RemoveContainer" containerID="b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.253190 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556\": container with ID starting with b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556 not found: ID does not exist" containerID="b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.253232 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556"} err="failed to get container status \"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556\": rpc error: code = NotFound desc = could not find container \"b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556\": container with ID starting with b1487a3f8698b892ff44689000198532a30e9d0318b624f1adc839adad700556 not found: ID does not exist" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.253262 4744 scope.go:117] "RemoveContainer" containerID="6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.253507 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d\": container with ID starting with 6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d not found: ID does not exist" containerID="6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.253540 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d"} err="failed to get container status \"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d\": rpc error: code = NotFound desc = could not find container \"6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d\": container with ID starting with 6ed5d3542bab048604b0f262783be3d358c035ed8bc9938c2e44afd55482617d not found: ID does not exist" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.266553 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.266574 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b283d3-4309-4156-a59f-3031bd597f19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.484498 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.497520 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.519636 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.520112 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-central-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.520135 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-central-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.520157 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="sg-core" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.520167 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="sg-core" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 
20:36:27.520208 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-notification-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.520222 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-notification-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: E1205 20:36:27.520241 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="proxy-httpd" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.520251 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="proxy-httpd" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.521538 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-central-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.521562 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="sg-core" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.521579 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="ceilometer-notification-agent" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.521594 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b283d3-4309-4156-a59f-3031bd597f19" containerName="proxy-httpd" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.524787 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.528526 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.529179 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.529531 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.537807 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570233 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570311 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ljp\" (UniqueName: \"kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570508 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.570534 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.671769 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672075 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672181 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p6ljp\" (UniqueName: \"kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672217 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.672344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.673470 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.673695 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.676325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.676675 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.679038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.681762 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc 
kubenswrapper[4744]: I1205 20:36:27.681994 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.693018 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ljp\" (UniqueName: \"kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp\") pod \"ceilometer-0\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:27 crc kubenswrapper[4744]: I1205 20:36:27.885743 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:28 crc kubenswrapper[4744]: I1205 20:36:28.102991 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b283d3-4309-4156-a59f-3031bd597f19" path="/var/lib/kubelet/pods/33b283d3-4309-4156-a59f-3031bd597f19/volumes" Dec 05 20:36:28 crc kubenswrapper[4744]: I1205 20:36:28.409894 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:36:28 crc kubenswrapper[4744]: W1205 20:36:28.417449 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2016fad5_7df3_474b_8322_7f8a81811556.slice/crio-07eaa7bea41a9a6abb8d228c952c6edf05666335dcb11aa9998f7549ece3075f WatchSource:0}: Error finding container 07eaa7bea41a9a6abb8d228c952c6edf05666335dcb11aa9998f7549ece3075f: Status 404 returned error can't find the container with id 07eaa7bea41a9a6abb8d228c952c6edf05666335dcb11aa9998f7549ece3075f Dec 05 20:36:29 crc kubenswrapper[4744]: I1205 20:36:29.221251 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerStarted","Data":"07eaa7bea41a9a6abb8d228c952c6edf05666335dcb11aa9998f7549ece3075f"} Dec 05 20:36:29 crc kubenswrapper[4744]: I1205 20:36:29.595545 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:29 crc kubenswrapper[4744]: I1205 20:36:29.595600 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:29 crc kubenswrapper[4744]: I1205 20:36:29.651958 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.231557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerStarted","Data":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.231872 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerStarted","Data":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.266732 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:30 crc 
kubenswrapper[4744]: I1205 20:36:30.268447 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.288381 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.310266 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.420808 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.420871 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.421143 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z757z\" (UniqueName: \"kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.523149 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.523201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.523271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z757z\" (UniqueName: \"kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.523721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.524445 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.543264 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z757z\" (UniqueName: \"kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z\") pod \"community-operators-lrgd8\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:30 crc kubenswrapper[4744]: I1205 20:36:30.584352 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:31 crc kubenswrapper[4744]: I1205 20:36:31.121228 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:31 crc kubenswrapper[4744]: I1205 20:36:31.242032 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerStarted","Data":"c4f51dce2eba0fef8f559c5b61c5f5f8377750cc18553e3f41b1118c9f07a1c3"} Dec 05 20:36:31 crc kubenswrapper[4744]: I1205 20:36:31.661157 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"] Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.251423 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerStarted","Data":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.252908 4744 generic.go:334] "Generic (PLEG): container finished" podID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerID="ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602" exitCode=0 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.252978 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerDied","Data":"ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602"} Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.253283 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mdr9" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="registry-server" containerID="cri-o://7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577" gracePeriod=2 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.773927 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.774645 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/memcached-0" podUID="541c0230-6b36-4415-b8c6-9307b6529783" containerName="memcached" containerID="cri-o://8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e" gracePeriod=30 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.804889 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.805213 4744 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-kuttl-api-log" containerID="cri-o://f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d" gracePeriod=30 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.805452 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-api" containerID="cri-o://7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee" gracePeriod=30 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.832807 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.833013 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" containerName="watcher-decision-engine" containerID="cri-o://d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f" gracePeriod=30 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.885666 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.886168 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerName="watcher-applier" containerID="cri-o://36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" gracePeriod=30 Dec 05 20:36:32 crc kubenswrapper[4744]: I1205 20:36:32.938596 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.056354 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8fnjk"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.066271 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content\") pod \"9c7575ce-0998-494d-9273-8c7624057354\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.066435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities\") pod \"9c7575ce-0998-494d-9273-8c7624057354\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.066551 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdhw\" (UniqueName: \"kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw\") pod \"9c7575ce-0998-494d-9273-8c7624057354\" (UID: \"9c7575ce-0998-494d-9273-8c7624057354\") " Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.071023 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities" (OuterVolumeSpecName: "utilities") pod "9c7575ce-0998-494d-9273-8c7624057354" (UID: "9c7575ce-0998-494d-9273-8c7624057354"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.073568 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8fnjk"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.080665 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw" (OuterVolumeSpecName: "kube-api-access-njdhw") pod "9c7575ce-0998-494d-9273-8c7624057354" (UID: "9c7575ce-0998-494d-9273-8c7624057354"). InnerVolumeSpecName "kube-api-access-njdhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.103845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c7575ce-0998-494d-9273-8c7624057354" (UID: "9c7575ce-0998-494d-9273-8c7624057354"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.129251 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8ktjk"] Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.129622 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="registry-server" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.129633 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="registry-server" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.129643 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="extract-content" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.129650 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="extract-content" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.129657 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="extract-utilities" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.129663 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="extract-utilities" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.129828 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7575ce-0998-494d-9273-8c7624057354" containerName="registry-server" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.130375 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.141151 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8ktjk"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.151087 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.151246 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-mtls" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169586 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169772 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169879 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.169957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5m9d\" (UniqueName: \"kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.170062 4744 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdhw\" (UniqueName: \"kubernetes.io/projected/9c7575ce-0998-494d-9273-8c7624057354-kube-api-access-njdhw\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.170127 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.170180 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7575ce-0998-494d-9273-8c7624057354-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271538 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271578 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.271629 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5m9d\" (UniqueName: \"kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.275190 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.275908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.277855 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.279926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.280641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.284875 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.286106 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.286509 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerID="f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d" exitCode=143 Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.286567 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerDied","Data":"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d"} Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.299457 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.299541 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerName="watcher-applier" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.300211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.301095 4744 generic.go:334] "Generic (PLEG): container finished" podID="9c7575ce-0998-494d-9273-8c7624057354" containerID="7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577" exitCode=0 Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.301137 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerDied","Data":"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577"} Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.301155 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mdr9" event={"ID":"9c7575ce-0998-494d-9273-8c7624057354","Type":"ContainerDied","Data":"031f1a04a516ad2be2feb9460290e1ccad528ea8abb00bf609045efe66f4f92d"} Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.301172 4744 scope.go:117] "RemoveContainer" containerID="7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.301305 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mdr9" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.305281 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5m9d\" (UniqueName: \"kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d\") pod \"keystone-bootstrap-8ktjk\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.316680 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerStarted","Data":"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9"} Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.335789 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.338674 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mdr9"] Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.340006 4744 scope.go:117] "RemoveContainer" containerID="d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.340259 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerStarted","Data":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.340504 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.382016 4744 scope.go:117] "RemoveContainer" containerID="7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.400112 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.231115866 podStartE2EDuration="6.400090027s" podCreationTimestamp="2025-12-05 20:36:27 +0000 UTC" firstStartedPulling="2025-12-05 20:36:28.420615488 +0000 UTC m=+1558.650426856" lastFinishedPulling="2025-12-05 20:36:32.589589649 +0000 UTC m=+1562.819401017" observedRunningTime="2025-12-05 20:36:33.37901966 +0000 UTC m=+1563.608831038" watchObservedRunningTime="2025-12-05 20:36:33.400090027 +0000 UTC m=+1563.629901395" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.423595 4744 scope.go:117] "RemoveContainer" containerID="7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.430510 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577\": container with ID starting with 7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577 not found: ID does not exist" containerID="7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.430552 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577"} err="failed to get container status 
\"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577\": rpc error: code = NotFound desc = could not find container \"7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577\": container with ID starting with 7c07da39efab09b44009b571f142f7b7ff2fbff2c0739fa1910181787018e577 not found: ID does not exist" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.430576 4744 scope.go:117] "RemoveContainer" containerID="d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.430921 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed\": container with ID starting with d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed not found: ID does not exist" containerID="d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.430940 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed"} err="failed to get container status \"d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed\": rpc error: code = NotFound desc = could not find container \"d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed\": container with ID starting with d76dd6b3651a26b7f41e67c15e12780e497a5308a164c5d98154425296b3eeed not found: ID does not exist" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.430952 4744 scope.go:117] "RemoveContainer" containerID="7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256" Dec 05 20:36:33 crc kubenswrapper[4744]: E1205 20:36:33.434188 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256\": container with ID starting with 7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256 not found: ID does not exist" containerID="7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.434358 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256"} err="failed to get container status \"7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256\": rpc error: code = NotFound desc = could not find container \"7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256\": container with ID starting with 7584ce84e29e61fc1a987b64a9f25b341e3199373e8c2c2c61788c8408c64256 not found: ID does not exist" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.496676 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:33 crc kubenswrapper[4744]: I1205 20:36:33.937251 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8ktjk"] Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.073043 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.097720 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7575ce-0998-494d-9273-8c7624057354" path="/var/lib/kubelet/pods/9c7575ce-0998-494d-9273-8c7624057354/volumes" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.101886 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7db7edb-cbd0-485c-89c5-8f621cdf47df" path="/var/lib/kubelet/pods/b7db7edb-cbd0-485c-89c5-8f621cdf47df/volumes" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.206358 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle\") pod \"541c0230-6b36-4415-b8c6-9307b6529783\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.206428 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config\") pod \"541c0230-6b36-4415-b8c6-9307b6529783\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.206607 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs\") pod \"541c0230-6b36-4415-b8c6-9307b6529783\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.207027 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7shd\" (UniqueName: \"kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd\") pod \"541c0230-6b36-4415-b8c6-9307b6529783\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.207092 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data\") pod \"541c0230-6b36-4415-b8c6-9307b6529783\" (UID: \"541c0230-6b36-4415-b8c6-9307b6529783\") " Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.207269 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "541c0230-6b36-4415-b8c6-9307b6529783" (UID: "541c0230-6b36-4415-b8c6-9307b6529783"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.207624 4744 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.208803 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data" (OuterVolumeSpecName: "config-data") pod "541c0230-6b36-4415-b8c6-9307b6529783" (UID: "541c0230-6b36-4415-b8c6-9307b6529783"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.212490 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd" (OuterVolumeSpecName: "kube-api-access-j7shd") pod "541c0230-6b36-4415-b8c6-9307b6529783" (UID: "541c0230-6b36-4415-b8c6-9307b6529783"). InnerVolumeSpecName "kube-api-access-j7shd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.229715 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "541c0230-6b36-4415-b8c6-9307b6529783" (UID: "541c0230-6b36-4415-b8c6-9307b6529783"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.259154 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "541c0230-6b36-4415-b8c6-9307b6529783" (UID: "541c0230-6b36-4415-b8c6-9307b6529783"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.309610 4744 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.309639 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7shd\" (UniqueName: \"kubernetes.io/projected/541c0230-6b36-4415-b8c6-9307b6529783-kube-api-access-j7shd\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.309650 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541c0230-6b36-4415-b8c6-9307b6529783-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.309658 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541c0230-6b36-4415-b8c6-9307b6529783-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.350912 4744 generic.go:334] "Generic (PLEG): container finished" podID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerID="f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9" exitCode=0 Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.351003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerDied","Data":"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9"} Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.352768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" event={"ID":"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d","Type":"ContainerStarted","Data":"50c232af572cf057632a973e1b339c0dc214c5f0f2c0ea3f1145c19d187b9198"} Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.352803 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" event={"ID":"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d","Type":"ContainerStarted","Data":"41d738df3412448406c078908901c017d0a26f0a15df4169682188d6cecfe5a5"} Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.356812 4744 generic.go:334] "Generic (PLEG): container finished" podID="541c0230-6b36-4415-b8c6-9307b6529783" containerID="8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e" exitCode=0 Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.356906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"541c0230-6b36-4415-b8c6-9307b6529783","Type":"ContainerDied","Data":"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e"} Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.356951 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"541c0230-6b36-4415-b8c6-9307b6529783","Type":"ContainerDied","Data":"e0218214aac5d2361ea00d88932191aca09246f4d875ea8365dfbba716ce5b38"} Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.356968 4744 scope.go:117] "RemoveContainer" containerID="8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.356920 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.386995 4744 scope.go:117] "RemoveContainer" containerID="8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e" Dec 05 20:36:34 crc kubenswrapper[4744]: E1205 20:36:34.387670 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e\": container with ID starting with 8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e not found: ID does not exist" containerID="8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.387699 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e"} err="failed to get container status \"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e\": rpc error: code = NotFound desc = could not find container \"8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e\": container with ID starting with 8c8cdebd28cf1a71333b0d7cd9bf63b3bf522de5a48ffb5c5962081988c8cf1e not found: ID does not exist" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.400353 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" podStartSLOduration=1.400329592 podStartE2EDuration="1.400329592s" podCreationTimestamp="2025-12-05 20:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:34.399001549 +0000 UTC m=+1564.628812917" watchObservedRunningTime="2025-12-05 20:36:34.400329592 +0000 UTC m=+1564.630140960" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.416904 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.428336 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.428357 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.163:9322/\": read tcp 10.217.0.2:37752->10.217.0.163:9322: read: connection reset by peer" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.428399 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.163:9322/\": read tcp 10.217.0.2:37768->10.217.0.163:9322: read: connection reset by peer" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.436917 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:34 crc kubenswrapper[4744]: E1205 20:36:34.437255 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541c0230-6b36-4415-b8c6-9307b6529783" containerName="memcached" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.437272 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="541c0230-6b36-4415-b8c6-9307b6529783" containerName="memcached" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.437450 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="541c0230-6b36-4415-b8c6-9307b6529783" containerName="memcached" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.438004 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.443733 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.443873 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-r7jzb" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.443912 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.462519 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.514069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6zj\" (UniqueName: \"kubernetes.io/projected/21a4ed1e-1e04-482c-a036-dc690da56572-kube-api-access-ww6zj\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.514113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.514163 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-config-data\") pod \"memcached-0\" (UID: 
\"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.514329 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-kolla-config\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.514519 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.617077 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.617517 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6zj\" (UniqueName: \"kubernetes.io/projected/21a4ed1e-1e04-482c-a036-dc690da56572-kube-api-access-ww6zj\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.617647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.617790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-config-data\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.617984 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-kolla-config\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.618835 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-kolla-config\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.618854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21a4ed1e-1e04-482c-a036-dc690da56572-config-data\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.621398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.621662 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a4ed1e-1e04-482c-a036-dc690da56572-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.633754 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6zj\" (UniqueName: \"kubernetes.io/projected/21a4ed1e-1e04-482c-a036-dc690da56572-kube-api-access-ww6zj\") pod \"memcached-0\" (UID: \"21a4ed1e-1e04-482c-a036-dc690da56572\") " pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:34 crc kubenswrapper[4744]: I1205 20:36:34.830668 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.051797 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.134773 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.134884 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.134988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.135021 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.135042 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v98z\" (UniqueName: \"kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.135063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.135092 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs\") pod \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\" (UID: \"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.138306 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs" (OuterVolumeSpecName: "logs") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.149647 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z" (OuterVolumeSpecName: "kube-api-access-7v98z") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "kube-api-access-7v98z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.171510 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.171588 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.183702 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.192495 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data" (OuterVolumeSpecName: "config-data") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.215840 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" (UID: "9b69f9d3-ca24-4a6c-af4c-70722cb8d30d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236561 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236598 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236608 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v98z\" (UniqueName: \"kubernetes.io/projected/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-kube-api-access-7v98z\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236616 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236625 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236633 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.236641 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.343109 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.373348 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerStarted","Data":"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8"} Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.374860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"21a4ed1e-1e04-482c-a036-dc690da56572","Type":"ContainerStarted","Data":"006c5760d1a7f570a6a34ca87eca61d85a1b4bfbcd0dc217f08490a3c0137b65"} Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.380480 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerID="7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee" exitCode=0 Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.380549 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.380590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerDied","Data":"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee"} Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.380626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9b69f9d3-ca24-4a6c-af4c-70722cb8d30d","Type":"ContainerDied","Data":"55c48e1ec81606c484286d3c3a765b5f01b6b43847a6bde1f79d3b3c81cb240b"} Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.380649 4744 scope.go:117] "RemoveContainer" containerID="7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.405408 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrgd8" podStartSLOduration=2.829001564 podStartE2EDuration="5.405387865s" podCreationTimestamp="2025-12-05 20:36:30 +0000 UTC" firstStartedPulling="2025-12-05 20:36:32.254846535 +0000 UTC m=+1562.484657913" lastFinishedPulling="2025-12-05 20:36:34.831232846 +0000 UTC m=+1565.061044214" observedRunningTime="2025-12-05 20:36:35.395662586 +0000 UTC m=+1565.625473964" watchObservedRunningTime="2025-12-05 20:36:35.405387865 +0000 UTC m=+1565.635199233" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.437217 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.442282 4744 scope.go:117] "RemoveContainer" containerID="f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.444478 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.464726 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:35 crc kubenswrapper[4744]: E1205 20:36:35.465644 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-kuttl-api-log" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.465668 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-kuttl-api-log" Dec 05 20:36:35 crc kubenswrapper[4744]: E1205 20:36:35.465726 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-api" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.465734 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-api" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.465896 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-api" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.465918 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" containerName="watcher-kuttl-api-log" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.466725 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.471792 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.472176 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.472345 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.490580 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.491124 4744 scope.go:117] "RemoveContainer" containerID="7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee" Dec 05 20:36:35 crc kubenswrapper[4744]: E1205 20:36:35.491662 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee\": container with ID starting with 7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee not found: ID does not exist" containerID="7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.491692 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee"} err="failed to get container status \"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee\": rpc error: code = NotFound desc = could not find container \"7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee\": container with ID starting with 7831fbe0bf564b5c5a38dde4a872b42ccf17d256133c7514e7f4b37ce9bc02ee not found: ID does not exist" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.491716 4744 scope.go:117] "RemoveContainer" containerID="f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d" Dec 05 20:36:35 crc kubenswrapper[4744]: E1205 20:36:35.492050 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d\": container with ID starting with f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d not found: ID does not exist" containerID="f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.492085 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d"} err="failed to get container status \"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d\": rpc error: code = NotFound desc = could not find container \"f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d\": container with ID starting with f4cbdbc6408d4ce362ac62fc9ac43a51ade1669bc5c4173a2fd2a8ae955b913d not found: ID does not exist" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546248 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data\") pod 
\"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546324 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qhq\" (UniqueName: \"kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546389 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546471 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.546500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649006 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649119 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qhq\" (UniqueName: \"kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649238 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.649354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.650816 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.654972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.655771 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" 
(UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.660010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.660159 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.661583 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.661784 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.700635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qhq\" (UniqueName: \"kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq\") pod \"watcher-kuttl-api-0\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.780942 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.855186 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs\") pod \"9fddddf3-b4be-4c28-a92f-87c7359418bc\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.855368 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7jxn\" (UniqueName: \"kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn\") pod \"9fddddf3-b4be-4c28-a92f-87c7359418bc\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.855413 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle\") pod \"9fddddf3-b4be-4c28-a92f-87c7359418bc\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.855481 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data\") pod \"9fddddf3-b4be-4c28-a92f-87c7359418bc\" (UID: \"9fddddf3-b4be-4c28-a92f-87c7359418bc\") " Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.857843 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs" (OuterVolumeSpecName: "logs") pod "9fddddf3-b4be-4c28-a92f-87c7359418bc" (UID: "9fddddf3-b4be-4c28-a92f-87c7359418bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.861469 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.861839 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn" (OuterVolumeSpecName: "kube-api-access-h7jxn") pod "9fddddf3-b4be-4c28-a92f-87c7359418bc" (UID: "9fddddf3-b4be-4c28-a92f-87c7359418bc"). InnerVolumeSpecName "kube-api-access-h7jxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.911573 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fddddf3-b4be-4c28-a92f-87c7359418bc" (UID: "9fddddf3-b4be-4c28-a92f-87c7359418bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.915360 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data" (OuterVolumeSpecName: "config-data") pod "9fddddf3-b4be-4c28-a92f-87c7359418bc" (UID: "9fddddf3-b4be-4c28-a92f-87c7359418bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.959376 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fddddf3-b4be-4c28-a92f-87c7359418bc-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.959415 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7jxn\" (UniqueName: \"kubernetes.io/projected/9fddddf3-b4be-4c28-a92f-87c7359418bc-kube-api-access-h7jxn\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.959428 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:35 crc kubenswrapper[4744]: I1205 20:36:35.959438 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fddddf3-b4be-4c28-a92f-87c7359418bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.094846 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541c0230-6b36-4415-b8c6-9307b6529783" path="/var/lib/kubelet/pods/541c0230-6b36-4415-b8c6-9307b6529783/volumes" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.095436 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b69f9d3-ca24-4a6c-af4c-70722cb8d30d" path="/var/lib/kubelet/pods/9b69f9d3-ca24-4a6c-af4c-70722cb8d30d/volumes" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.354797 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.402439 4744 generic.go:334] "Generic (PLEG): container finished" podID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" exitCode=0 Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.402510 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.402494 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9fddddf3-b4be-4c28-a92f-87c7359418bc","Type":"ContainerDied","Data":"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f"} Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.402643 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9fddddf3-b4be-4c28-a92f-87c7359418bc","Type":"ContainerDied","Data":"d1443b08c519ebb097f610b1f8de7848c31307170e82fd78a820144c9d447430"} Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.402667 4744 scope.go:117] "RemoveContainer" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.419844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"21a4ed1e-1e04-482c-a036-dc690da56572","Type":"ContainerStarted","Data":"4a2d92cde77576346fe9edaf2f1d0f3b87a69a2b5d64dae7590b7d11c2a67a91"} Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.420710 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.440521 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.446151 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerStarted","Data":"2d6c2d40b298b56db2806947b687b8da0faa9dfed60034632093c3d986190190"} Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.453796 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.454615 4744 scope.go:117] "RemoveContainer" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" Dec 05 20:36:36 crc kubenswrapper[4744]: E1205 20:36:36.458817 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f\": container with ID starting with 36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f not found: ID does not exist" containerID="36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.458864 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f"} err="failed to get container status \"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f\": rpc error: code = NotFound desc = could not find container \"36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f\": container with ID starting with 36501811ab4afd42f792a0a8d8e192fb65db567da98cb1752c2f19253d9b019f not found: ID does not exist" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.465958 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:36 crc kubenswrapper[4744]: E1205 20:36:36.466304 4744 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerName="watcher-applier" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.466318 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerName="watcher-applier" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.466514 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" containerName="watcher-applier" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.467180 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.473679 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.484879 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.491407 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.491391243 podStartE2EDuration="2.491391243s" podCreationTimestamp="2025-12-05 20:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:36.458835284 +0000 UTC m=+1566.688646652" watchObservedRunningTime="2025-12-05 20:36:36.491391243 +0000 UTC m=+1566.721202611" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.572472 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.572575 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.572644 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.572674 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.572690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspqh\" (UniqueName: \"kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh\") pod \"watcher-kuttl-applier-0\" 
(UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674410 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674461 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674511 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspqh\" (UniqueName: \"kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674574 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.674993 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.679806 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.679951 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.681707 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.693168 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspqh\" (UniqueName: \"kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh\") pod \"watcher-kuttl-applier-0\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.814307 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.816494 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.880763 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf94v\" (UniqueName: \"kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v\") pod \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.889723 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs\") pod \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.889922 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data\") pod \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.890047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle\") pod \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.890203 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca\") pod \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\" (UID: \"a6709afd-afa1-4ae1-bf43-bdf93b5bce55\") " Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.892518 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs" (OuterVolumeSpecName: "logs") pod "a6709afd-afa1-4ae1-bf43-bdf93b5bce55" (UID: "a6709afd-afa1-4ae1-bf43-bdf93b5bce55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.899069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v" (OuterVolumeSpecName: "kube-api-access-vf94v") pod "a6709afd-afa1-4ae1-bf43-bdf93b5bce55" (UID: "a6709afd-afa1-4ae1-bf43-bdf93b5bce55"). InnerVolumeSpecName "kube-api-access-vf94v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.924072 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6709afd-afa1-4ae1-bf43-bdf93b5bce55" (UID: "a6709afd-afa1-4ae1-bf43-bdf93b5bce55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.934528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a6709afd-afa1-4ae1-bf43-bdf93b5bce55" (UID: "a6709afd-afa1-4ae1-bf43-bdf93b5bce55"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.947403 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data" (OuterVolumeSpecName: "config-data") pod "a6709afd-afa1-4ae1-bf43-bdf93b5bce55" (UID: "a6709afd-afa1-4ae1-bf43-bdf93b5bce55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.992403 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.992432 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.992441 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf94v\" (UniqueName: \"kubernetes.io/projected/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-kube-api-access-vf94v\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.992453 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:36 crc kubenswrapper[4744]: I1205 20:36:36.992461 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6709afd-afa1-4ae1-bf43-bdf93b5bce55-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.345230 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.459136 4744 generic.go:334] "Generic (PLEG): container finished" podID="b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" containerID="50c232af572cf057632a973e1b339c0dc214c5f0f2c0ea3f1145c19d187b9198" exitCode=0 Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.459229 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" event={"ID":"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d","Type":"ContainerDied","Data":"50c232af572cf057632a973e1b339c0dc214c5f0f2c0ea3f1145c19d187b9198"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.461919 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerStarted","Data":"977ba308d2c1e702ea95ffd070803eb9e359b7c97e6097b6c96e9a3be0288aaf"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.461953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerStarted","Data":"b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.462160 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.463321 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c02b7d0f-79b3-4d57-8599-b0a077a1747f","Type":"ContainerStarted","Data":"d7429fa526fb56ccc4c5cb056c08c22cf979b6c31b95295d12b8e60d8d70cd15"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.486749 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" containerID="d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f" exitCode=0 Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.486861 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a6709afd-afa1-4ae1-bf43-bdf93b5bce55","Type":"ContainerDied","Data":"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.486895 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"a6709afd-afa1-4ae1-bf43-bdf93b5bce55","Type":"ContainerDied","Data":"b82972244f0ebb43c649c5488ea33409345587d1cde499eb8dfa038d8caf8fda"} Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.486915 4744 scope.go:117] "RemoveContainer" containerID="d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.487053 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.511034 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.5110091839999997 podStartE2EDuration="2.511009184s" podCreationTimestamp="2025-12-05 20:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:37.508197455 +0000 UTC m=+1567.738008853" watchObservedRunningTime="2025-12-05 20:36:37.511009184 +0000 UTC m=+1567.740820552" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.524283 4744 scope.go:117] "RemoveContainer" containerID="d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f" Dec 05 20:36:37 crc kubenswrapper[4744]: E1205 20:36:37.525050 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f\": container with ID starting with d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f not found: ID does not exist" containerID="d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.525099 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f"} err="failed to get container status \"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f\": rpc error: code = NotFound desc = could not find container \"d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f\": container with ID starting with d4d7b1c5cb5663bacb4fbf7c5e155389bef2ea5aa3ff1cdb529d7dffa52f5c5f not found: ID does not exist" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.553812 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.563148 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.579701 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:37 crc kubenswrapper[4744]: E1205 20:36:37.580321 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" containerName="watcher-decision-engine" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.580403 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" containerName="watcher-decision-engine" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.580608 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" containerName="watcher-decision-engine" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.581192 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.583363 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.592519 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704124 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704272 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704381 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zqld\" (UniqueName: \"kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704453 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.704493 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806391 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806501 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zqld\" (UniqueName: \"kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806561 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.806582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.807511 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.811351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.811451 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.813560 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 
20:36:37.823154 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.826384 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zqld\" (UniqueName: \"kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:37 crc kubenswrapper[4744]: I1205 20:36:37.897658 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.111501 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fddddf3-b4be-4c28-a92f-87c7359418bc" path="/var/lib/kubelet/pods/9fddddf3-b4be-4c28-a92f-87c7359418bc/volumes" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.112665 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6709afd-afa1-4ae1-bf43-bdf93b5bce55" path="/var/lib/kubelet/pods/a6709afd-afa1-4ae1-bf43-bdf93b5bce55/volumes" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.353240 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:36:38 crc kubenswrapper[4744]: W1205 20:36:38.367723 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4978e864_b896_4580_a9cf_796c8d465b8a.slice/crio-3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c WatchSource:0}: Error finding container 3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c: Status 404 returned error can't find the container with id 3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.508283 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"4978e864-b896-4580-a9cf-796c8d465b8a","Type":"ContainerStarted","Data":"3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c"} Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.512513 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c02b7d0f-79b3-4d57-8599-b0a077a1747f","Type":"ContainerStarted","Data":"6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5"} Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.536681 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.536663291 podStartE2EDuration="2.536663291s" podCreationTimestamp="2025-12-05 20:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:38.53212008 +0000 UTC m=+1568.761931458" watchObservedRunningTime="2025-12-05 20:36:38.536663291 +0000 UTC m=+1568.766474659" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.792746 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.929709 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.929996 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.930032 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.930051 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5m9d\" (UniqueName: \"kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.930132 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.930156 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.930207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle\") pod \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\" (UID: \"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d\") " Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.940034 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.941453 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d" (OuterVolumeSpecName: "kube-api-access-w5m9d") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "kube-api-access-w5m9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.942594 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts" (OuterVolumeSpecName: "scripts") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.947416 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.963590 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data" (OuterVolumeSpecName: "config-data") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.968164 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:38 crc kubenswrapper[4744]: I1205 20:36:38.991487 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" (UID: "b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031900 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031933 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031941 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031951 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5m9d\" (UniqueName: \"kubernetes.io/projected/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-kube-api-access-w5m9d\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031961 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031969 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.031977 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.527872 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.527870 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-8ktjk" event={"ID":"b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d","Type":"ContainerDied","Data":"41d738df3412448406c078908901c017d0a26f0a15df4169682188d6cecfe5a5"} Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.527929 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d738df3412448406c078908901c017d0a26f0a15df4169682188d6cecfe5a5" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.534702 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"4978e864-b896-4580-a9cf-796c8d465b8a","Type":"ContainerStarted","Data":"b5417a2f9d6daa4ff819c2650d9c6bee168a458c4c0ed892c21a7cd9ad34871e"} Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.558576 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.558559077 podStartE2EDuration="2.558559077s" podCreationTimestamp="2025-12-05 20:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:39.552494059 +0000 UTC m=+1569.782305427" watchObservedRunningTime="2025-12-05 20:36:39.558559077 +0000 UTC m=+1569.788370445" Dec 05 20:36:39 crc kubenswrapper[4744]: I1205 20:36:39.797517 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:40 crc kubenswrapper[4744]: I1205 20:36:40.584878 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:40 crc kubenswrapper[4744]: I1205 20:36:40.585215 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:40 crc kubenswrapper[4744]: I1205 20:36:40.641976 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:40 crc kubenswrapper[4744]: I1205 20:36:40.862329 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:41 crc kubenswrapper[4744]: I1205 20:36:41.632846 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:41 crc kubenswrapper[4744]: E1205 20:36:41.745382 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:33796->38.102.83.51:42651: write tcp 38.102.83.51:33796->38.102.83.51:42651: write: broken pipe Dec 05 20:36:41 crc kubenswrapper[4744]: I1205 20:36:41.815207 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:42 crc kubenswrapper[4744]: I1205 20:36:42.260940 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:43 crc kubenswrapper[4744]: I1205 20:36:43.570257 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrgd8" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="registry-server" 
containerID="cri-o://ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8" gracePeriod=2 Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.039350 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.122801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities\") pod \"935c03c1-3eea-42e7-af3c-b243f498ad31\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.122952 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content\") pod \"935c03c1-3eea-42e7-af3c-b243f498ad31\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.136870 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities" (OuterVolumeSpecName: "utilities") pod "935c03c1-3eea-42e7-af3c-b243f498ad31" (UID: "935c03c1-3eea-42e7-af3c-b243f498ad31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.148495 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z757z\" (UniqueName: \"kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z\") pod \"935c03c1-3eea-42e7-af3c-b243f498ad31\" (UID: \"935c03c1-3eea-42e7-af3c-b243f498ad31\") " Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.149175 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.166555 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z" (OuterVolumeSpecName: "kube-api-access-z757z") pod "935c03c1-3eea-42e7-af3c-b243f498ad31" (UID: "935c03c1-3eea-42e7-af3c-b243f498ad31"). InnerVolumeSpecName "kube-api-access-z757z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.210626 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "935c03c1-3eea-42e7-af3c-b243f498ad31" (UID: "935c03c1-3eea-42e7-af3c-b243f498ad31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.250646 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/935c03c1-3eea-42e7-af3c-b243f498ad31-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.250902 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z757z\" (UniqueName: \"kubernetes.io/projected/935c03c1-3eea-42e7-af3c-b243f498ad31-kube-api-access-z757z\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.579186 4744 generic.go:334] "Generic (PLEG): container finished" podID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerID="ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8" exitCode=0 Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.579225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerDied","Data":"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8"} Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.579251 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrgd8" event={"ID":"935c03c1-3eea-42e7-af3c-b243f498ad31","Type":"ContainerDied","Data":"c4f51dce2eba0fef8f559c5b61c5f5f8377750cc18553e3f41b1118c9f07a1c3"} Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.579267 4744 scope.go:117] "RemoveContainer" containerID="ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.579389 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrgd8" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.622679 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.632630 4744 scope.go:117] "RemoveContainer" containerID="f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.635742 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrgd8"] Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.659402 4744 scope.go:117] "RemoveContainer" containerID="ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.691075 4744 scope.go:117] "RemoveContainer" containerID="ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.693231 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8\": container with ID starting with ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8 not found: ID does not exist" containerID="ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.693277 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8"} err="failed to get container status \"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8\": rpc error: code = NotFound desc = could not find container \"ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8\": container with ID starting with ef0fdbb364240728020829f7418cbc40bef6e0d269dfbe671eb5f6f12faa59d8 not found: ID does not exist" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.693328 4744 scope.go:117] "RemoveContainer" containerID="f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.696670 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9\": container with ID starting with f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9 not found: ID does not exist" containerID="f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.696703 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9"} err="failed to get container status \"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9\": rpc error: code = NotFound desc = could not find container \"f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9\": container with ID starting with f3741ca1e87c6f491969a5a49b6fe359f2ed343e52072d64866e1123e01bf2d9 not found: ID does not exist" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.696727 4744 scope.go:117] "RemoveContainer" containerID="ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.697015 4744 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602\": container with ID starting with ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602 not found: ID does not exist" containerID="ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.697036 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602"} err="failed to get container status \"ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602\": rpc error: code = NotFound desc = could not find container \"ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602\": container with ID starting with ad111da2ae58170a878586f39ddfa3abf5c9f552de191a353d91180b09a27602 not found: ID does not exist" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.833452 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.976996 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-544f89f8d4-qfdqn"] Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.977393 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" containerName="keystone-bootstrap" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977417 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" containerName="keystone-bootstrap" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.977448 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="extract-content" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977457 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="extract-content" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.977475 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="extract-utilities" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977483 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="extract-utilities" Dec 05 20:36:44 crc kubenswrapper[4744]: E1205 20:36:44.977493 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="registry-server" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977499 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="registry-server" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977644 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" containerName="registry-server" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.977661 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" containerName="keystone-bootstrap" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.978237 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:44 crc kubenswrapper[4744]: I1205 20:36:44.995993 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-544f89f8d4-qfdqn"] Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063210 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-public-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063344 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-internal-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063396 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-combined-ca-bundle\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063513 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-credential-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063560 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-fernet-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063588 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-config-data\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063628 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-scripts\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063649 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-cert-memcached-mtls\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 
20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.063777 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjt5h\" (UniqueName: \"kubernetes.io/projected/30b134cc-5016-44bf-9d8a-b26e23b38782-kube-api-access-bjt5h\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjt5h\" (UniqueName: \"kubernetes.io/projected/30b134cc-5016-44bf-9d8a-b26e23b38782-kube-api-access-bjt5h\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165436 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-public-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-internal-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165520 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-combined-ca-bundle\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165552 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-credential-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165577 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-fernet-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165600 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-config-data\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165625 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-scripts\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " 
pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.165646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-cert-memcached-mtls\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.170380 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-internal-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.170721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-combined-ca-bundle\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.171032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-credential-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.171243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-fernet-keys\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.172663 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-config-data\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.173972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-scripts\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.175722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-cert-memcached-mtls\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.184719 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b134cc-5016-44bf-9d8a-b26e23b38782-public-tls-certs\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc 
kubenswrapper[4744]: I1205 20:36:45.187630 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjt5h\" (UniqueName: \"kubernetes.io/projected/30b134cc-5016-44bf-9d8a-b26e23b38782-kube-api-access-bjt5h\") pod \"keystone-544f89f8d4-qfdqn\" (UID: \"30b134cc-5016-44bf-9d8a-b26e23b38782\") " pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.295370 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.785382 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-544f89f8d4-qfdqn"] Dec 05 20:36:45 crc kubenswrapper[4744]: W1205 20:36:45.789780 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b134cc_5016_44bf_9d8a_b26e23b38782.slice/crio-f20aa6739ea77ac6a426096830007d424e64b5d55e48d36998ffae550f1b1dd7 WatchSource:0}: Error finding container f20aa6739ea77ac6a426096830007d424e64b5d55e48d36998ffae550f1b1dd7: Status 404 returned error can't find the container with id f20aa6739ea77ac6a426096830007d424e64b5d55e48d36998ffae550f1b1dd7 Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.861690 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:45 crc kubenswrapper[4744]: I1205 20:36:45.871639 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.092698 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935c03c1-3eea-42e7-af3c-b243f498ad31" path="/var/lib/kubelet/pods/935c03c1-3eea-42e7-af3c-b243f498ad31/volumes" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.600920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" event={"ID":"30b134cc-5016-44bf-9d8a-b26e23b38782","Type":"ContainerStarted","Data":"ad530b2158f1d8b632bb021118cac163d7caf0ca2ecaa9bb828bd7f64f0de375"} Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.601235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" event={"ID":"30b134cc-5016-44bf-9d8a-b26e23b38782","Type":"ContainerStarted","Data":"f20aa6739ea77ac6a426096830007d424e64b5d55e48d36998ffae550f1b1dd7"} Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.601274 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.609770 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.629195 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" podStartSLOduration=2.62916807 podStartE2EDuration="2.62916807s" podCreationTimestamp="2025-12-05 20:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:46.618125759 +0000 UTC m=+1576.847937147" watchObservedRunningTime="2025-12-05 20:36:46.62916807 +0000 UTC m=+1576.858979448" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 
20:36:46.731384 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.814657 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:46 crc kubenswrapper[4744]: I1205 20:36:46.841190 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:47 crc kubenswrapper[4744]: I1205 20:36:47.641064 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:36:47 crc kubenswrapper[4744]: I1205 20:36:47.898108 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:47 crc kubenswrapper[4744]: I1205 20:36:47.922726 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:48 crc kubenswrapper[4744]: I1205 20:36:48.618855 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-kuttl-api-log" containerID="cri-o://b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292" gracePeriod=30 Dec 05 20:36:48 crc kubenswrapper[4744]: I1205 20:36:48.619342 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:48 crc kubenswrapper[4744]: I1205 20:36:48.619409 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-api" containerID="cri-o://977ba308d2c1e702ea95ffd070803eb9e359b7c97e6097b6c96e9a3be0288aaf" gracePeriod=30 Dec 05 20:36:48 crc kubenswrapper[4744]: I1205 20:36:48.680870 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:36:48 crc kubenswrapper[4744]: E1205 20:36:48.815180 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod919b5c2d_000e_4fb0_ae17_7e4258cf323c.slice/crio-b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod919b5c2d_000e_4fb0_ae17_7e4258cf323c.slice/crio-conmon-b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.628716 4744 generic.go:334] "Generic (PLEG): container finished" podID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerID="977ba308d2c1e702ea95ffd070803eb9e359b7c97e6097b6c96e9a3be0288aaf" exitCode=0 Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.629032 4744 generic.go:334] "Generic (PLEG): container finished" podID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerID="b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292" exitCode=143 Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.628796 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerDied","Data":"977ba308d2c1e702ea95ffd070803eb9e359b7c97e6097b6c96e9a3be0288aaf"} Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.629358 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerDied","Data":"b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292"} Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.919486 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965150 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5qhq\" (UniqueName: \"kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965337 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965392 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965512 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.965531 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls\") pod \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\" (UID: \"919b5c2d-000e-4fb0-ae17-7e4258cf323c\") " Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 
20:36:49.968661 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs" (OuterVolumeSpecName: "logs") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:36:49 crc kubenswrapper[4744]: I1205 20:36:49.987568 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq" (OuterVolumeSpecName: "kube-api-access-h5qhq") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "kube-api-access-h5qhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.017399 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.021440 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.042714 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.054422 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.060580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data" (OuterVolumeSpecName: "config-data") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067159 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067185 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/919b5c2d-000e-4fb0-ae17-7e4258cf323c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067195 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067204 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5qhq\" (UniqueName: \"kubernetes.io/projected/919b5c2d-000e-4fb0-ae17-7e4258cf323c-kube-api-access-h5qhq\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067215 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067224 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.067232 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.089717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "919b5c2d-000e-4fb0-ae17-7e4258cf323c" (UID: "919b5c2d-000e-4fb0-ae17-7e4258cf323c"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.169537 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/919b5c2d-000e-4fb0-ae17-7e4258cf323c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.642672 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.642672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"919b5c2d-000e-4fb0-ae17-7e4258cf323c","Type":"ContainerDied","Data":"2d6c2d40b298b56db2806947b687b8da0faa9dfed60034632093c3d986190190"} Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.642760 4744 scope.go:117] "RemoveContainer" containerID="977ba308d2c1e702ea95ffd070803eb9e359b7c97e6097b6c96e9a3be0288aaf" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.679162 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.686954 4744 scope.go:117] "RemoveContainer" containerID="b8ffc87dc94b10eda35da98b36180382371ef3aaa70e5cc582771657e126b292" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.710455 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.718637 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:50 crc kubenswrapper[4744]: E1205 20:36:50.719098 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-api" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.719123 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-api" Dec 05 20:36:50 crc kubenswrapper[4744]: E1205 20:36:50.719144 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-kuttl-api-log" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.719153 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-kuttl-api-log" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.719510 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-kuttl-api-log" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.719584 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" containerName="watcher-api" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.722475 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.725368 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.732693 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.779087 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.779395 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.779527 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.779673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.779891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.780007 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.881725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.882045 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data\") pod \"watcher-kuttl-api-0\" (UID: 
\"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.882151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.882342 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.882445 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.882544 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.883443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.887080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.887085 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.887182 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.887244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:50 crc kubenswrapper[4744]: I1205 20:36:50.902371 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49\") pod \"watcher-kuttl-api-0\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:51 crc kubenswrapper[4744]: I1205 20:36:51.048146 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:51 crc kubenswrapper[4744]: I1205 20:36:51.535159 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:36:51 crc kubenswrapper[4744]: I1205 20:36:51.649826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerStarted","Data":"74eda62d14c7d438bcfde80a14e75816b9c538768e17c09d208b2c2ac0bc90ed"} Dec 05 20:36:52 crc kubenswrapper[4744]: I1205 20:36:52.091123 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919b5c2d-000e-4fb0-ae17-7e4258cf323c" path="/var/lib/kubelet/pods/919b5c2d-000e-4fb0-ae17-7e4258cf323c/volumes" Dec 05 20:36:52 crc kubenswrapper[4744]: I1205 20:36:52.667633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerStarted","Data":"c418d9e285b4f0257143aa336232c4f3f6c49fd91d401fe502cdcf154341c447"} Dec 05 20:36:52 crc kubenswrapper[4744]: I1205 20:36:52.667693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerStarted","Data":"c250032ef55cc455ef8aeca089af586a7c01e7a787597c6053aa1a85cd02d841"} Dec 05 20:36:52 crc kubenswrapper[4744]: I1205 20:36:52.671099 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:52 crc kubenswrapper[4744]: I1205 20:36:52.708736 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.708701583 podStartE2EDuration="2.708701583s" podCreationTimestamp="2025-12-05 20:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:36:52.695726845 +0000 UTC m=+1582.925538253" watchObservedRunningTime="2025-12-05 20:36:52.708701583 +0000 UTC m=+1582.938512991" Dec 05 20:36:54 crc kubenswrapper[4744]: I1205 20:36:54.684103 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:36:54 crc kubenswrapper[4744]: I1205 20:36:54.862456 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:56 crc kubenswrapper[4744]: I1205 20:36:56.048452 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:36:57 crc kubenswrapper[4744]: I1205 20:36:57.896865 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:01 crc kubenswrapper[4744]: I1205 20:37:01.049932 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:37:01 crc kubenswrapper[4744]: 
I1205 20:37:01.055862 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:37:01 crc kubenswrapper[4744]: I1205 20:37:01.758511 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:37:16 crc kubenswrapper[4744]: I1205 20:37:16.766986 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-544f89f8d4-qfdqn" Dec 05 20:37:16 crc kubenswrapper[4744]: I1205 20:37:16.865131 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:37:16 crc kubenswrapper[4744]: I1205 20:37:16.865351 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" podUID="0474bea5-5db3-4b16-a280-9589048721c1" containerName="keystone-api" containerID="cri-o://c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a" gracePeriod=30 Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.404344 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.467922 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468031 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468366 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468434 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468509 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468535 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.468577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mmcj\" (UniqueName: \"kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj\") pod \"0474bea5-5db3-4b16-a280-9589048721c1\" (UID: \"0474bea5-5db3-4b16-a280-9589048721c1\") " Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.474563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts" (OuterVolumeSpecName: "scripts") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.474599 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj" (OuterVolumeSpecName: "kube-api-access-2mmcj") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "kube-api-access-2mmcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.474645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.479661 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.493966 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data" (OuterVolumeSpecName: "config-data") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.521485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.543162 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.557369 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0474bea5-5db3-4b16-a280-9589048721c1" (UID: "0474bea5-5db3-4b16-a280-9589048721c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570643 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570697 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570710 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570723 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570736 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570747 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mmcj\" (UniqueName: \"kubernetes.io/projected/0474bea5-5db3-4b16-a280-9589048721c1-kube-api-access-2mmcj\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570759 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.570770 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0474bea5-5db3-4b16-a280-9589048721c1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.946158 4744 generic.go:334] "Generic (PLEG): container finished" podID="0474bea5-5db3-4b16-a280-9589048721c1" containerID="c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a" exitCode=0 Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.946206 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" event={"ID":"0474bea5-5db3-4b16-a280-9589048721c1","Type":"ContainerDied","Data":"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a"} Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.946225 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.946253 4744 scope.go:117] "RemoveContainer" containerID="c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.946240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6" event={"ID":"0474bea5-5db3-4b16-a280-9589048721c1","Type":"ContainerDied","Data":"675f9cdff4adaf73cf7bda6f8bcb043fa2f456dc8f97d909a44c342d3578d5d7"} Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.982887 4744 scope.go:117] "RemoveContainer" containerID="c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a" Dec 05 20:37:20 crc kubenswrapper[4744]: E1205 20:37:20.983377 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a\": container with ID starting with c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a not found: ID does not exist" containerID="c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a" Dec 05 20:37:20 crc kubenswrapper[4744]: I1205 20:37:20.983406 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a"} err="failed to get container status \"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a\": rpc error: code = NotFound desc = could not find container \"c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a\": container with ID starting with c8c8ab4fb5df1a29dc62ce500c5591c1c3c2bc5c26ce0b93df59e3b4e025ce6a not found: ID does not exist" Dec 05 20:37:21 crc kubenswrapper[4744]: I1205 20:37:21.001586 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:37:21 crc kubenswrapper[4744]: I1205 20:37:21.010394 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-6ddbdbb77d-2mgh6"] Dec 05 20:37:22 crc kubenswrapper[4744]: I1205 20:37:22.105840 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0474bea5-5db3-4b16-a280-9589048721c1" path="/var/lib/kubelet/pods/0474bea5-5db3-4b16-a280-9589048721c1/volumes" Dec 05 20:37:23 crc kubenswrapper[4744]: I1205 20:37:23.989681 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:23 crc kubenswrapper[4744]: I1205 20:37:23.990276 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-central-agent" containerID="cri-o://8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" gracePeriod=30 Dec 05 20:37:23 crc kubenswrapper[4744]: I1205 20:37:23.990332 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="sg-core" containerID="cri-o://a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" gracePeriod=30 Dec 05 20:37:23 crc kubenswrapper[4744]: I1205 20:37:23.990398 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="proxy-httpd" 
containerID="cri-o://497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" gracePeriod=30 Dec 05 20:37:23 crc kubenswrapper[4744]: I1205 20:37:23.990403 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-notification-agent" containerID="cri-o://25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" gracePeriod=30 Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.822662 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.946491 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ljp\" (UniqueName: \"kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.946586 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.946633 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947251 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947425 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.946671 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947571 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947597 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947920 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.947948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data\") pod \"2016fad5-7df3-474b-8322-7f8a81811556\" (UID: \"2016fad5-7df3-474b-8322-7f8a81811556\") " Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.948338 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.948360 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2016fad5-7df3-474b-8322-7f8a81811556-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.953884 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp" (OuterVolumeSpecName: "kube-api-access-p6ljp") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "kube-api-access-p6ljp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.956281 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts" (OuterVolumeSpecName: "scripts") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:24 crc kubenswrapper[4744]: I1205 20:37:24.983276 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.001486 4744 generic.go:334] "Generic (PLEG): container finished" podID="2016fad5-7df3-474b-8322-7f8a81811556" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" exitCode=0 Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002070 4744 generic.go:334] "Generic (PLEG): container finished" podID="2016fad5-7df3-474b-8322-7f8a81811556" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" exitCode=2 Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002143 4744 generic.go:334] "Generic (PLEG): container finished" podID="2016fad5-7df3-474b-8322-7f8a81811556" containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" exitCode=0 Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002203 4744 generic.go:334] "Generic (PLEG): container finished" podID="2016fad5-7df3-474b-8322-7f8a81811556" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" exitCode=0 Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.001548 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerDied","Data":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002378 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerDied","Data":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002466 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerDied","Data":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerDied","Data":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2016fad5-7df3-474b-8322-7f8a81811556","Type":"ContainerDied","Data":"07eaa7bea41a9a6abb8d228c952c6edf05666335dcb11aa9998f7549ece3075f"} Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.002601 4744 scope.go:117] "RemoveContainer" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.001693 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.037023 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.039868 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.043750 4744 scope.go:117] "RemoveContainer" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.050052 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.050256 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.050391 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.050482 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ljp\" (UniqueName: \"kubernetes.io/projected/2016fad5-7df3-474b-8322-7f8a81811556-kube-api-access-p6ljp\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.050564 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.067487 4744 scope.go:117] "RemoveContainer" containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.070816 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data" (OuterVolumeSpecName: "config-data") pod "2016fad5-7df3-474b-8322-7f8a81811556" (UID: "2016fad5-7df3-474b-8322-7f8a81811556"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.090226 4744 scope.go:117] "RemoveContainer" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.152263 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2016fad5-7df3-474b-8322-7f8a81811556-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.153129 4744 scope.go:117] "RemoveContainer" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.153521 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": container with ID starting with 497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b not found: ID does not exist" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.153569 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} err="failed to get container status \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": rpc error: code = NotFound desc = could not find container \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": container with ID starting with 497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.153600 4744 scope.go:117] "RemoveContainer" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.153966 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": container with ID starting with a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3 not found: ID does not exist" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.153995 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} err="failed to get container status \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": rpc error: code = NotFound desc = could not find container \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": container with ID starting with a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154019 4744 scope.go:117] "RemoveContainer" containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.154209 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": container with ID starting with 25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789 not found: ID does not exist" 
containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154234 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} err="failed to get container status \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": rpc error: code = NotFound desc = could not find container \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": container with ID starting with 25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154250 4744 scope.go:117] "RemoveContainer" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.154604 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": container with ID starting with 8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536 not found: ID does not exist" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154641 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} err="failed to get container status \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": rpc error: code = NotFound desc = could not find container \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": container with ID starting with 8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154654 4744 scope.go:117] "RemoveContainer" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154845 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} err="failed to get container status \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": rpc error: code = NotFound desc = could not find container \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": container with ID starting with 497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.154871 4744 scope.go:117] "RemoveContainer" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155145 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} err="failed to get container status \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": rpc error: code = NotFound desc = could not find container \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": container with ID starting with a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155272 4744 scope.go:117] "RemoveContainer" 
containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155679 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} err="failed to get container status \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": rpc error: code = NotFound desc = could not find container \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": container with ID starting with 25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155707 4744 scope.go:117] "RemoveContainer" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155923 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} err="failed to get container status \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": rpc error: code = NotFound desc = could not find container \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": container with ID starting with 8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.155948 4744 scope.go:117] "RemoveContainer" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156139 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} err="failed to get container status \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": rpc error: code = NotFound desc = could not find container \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": container with ID starting with 497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156162 4744 scope.go:117] "RemoveContainer" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156386 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} err="failed to get container status \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": rpc error: code = NotFound desc = could not find container \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": container with ID starting with a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156407 4744 scope.go:117] "RemoveContainer" containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156592 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} err="failed to get container status \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": rpc error: code = NotFound desc = could not find 
container \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": container with ID starting with 25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156611 4744 scope.go:117] "RemoveContainer" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156787 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} err="failed to get container status \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": rpc error: code = NotFound desc = could not find container \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": container with ID starting with 8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.156896 4744 scope.go:117] "RemoveContainer" containerID="497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157205 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b"} err="failed to get container status \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": rpc error: code = NotFound desc = could not find container \"497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b\": container with ID starting with 497244786e4f59d6c634a2c83a8f20fb0bb3b665088c692d56fd4aff75550b2b not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157238 4744 scope.go:117] "RemoveContainer" containerID="a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157437 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3"} err="failed to get container status \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": rpc error: code = NotFound desc = could not find container \"a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3\": container with ID starting with a57077bcd22502309e6b6f0f343fb5fe50d48c411214736e44783cf5eb447df3 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157472 4744 scope.go:117] "RemoveContainer" containerID="25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157658 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789"} err="failed to get container status \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": rpc error: code = NotFound desc = could not find container \"25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789\": container with ID starting with 25ede7852bbc6568d0239c2feb09ef784d709a824642861972c08eaa3fe16789 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157677 4744 scope.go:117] "RemoveContainer" containerID="8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.157835 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536"} err="failed to get container status \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": rpc error: code = NotFound desc = could not find container \"8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536\": container with ID starting with 8824a89eb44d369357b3b72866639b6229b80284fefb24ab110f26fb4444c536 not found: ID does not exist" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.337336 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.349927 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361171 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.361500 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="proxy-httpd" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361518 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="proxy-httpd" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.361553 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-central-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361559 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-central-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.361569 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-notification-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361575 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-notification-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.361582 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="sg-core" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361589 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="sg-core" Dec 05 20:37:25 crc kubenswrapper[4744]: E1205 20:37:25.361604 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0474bea5-5db3-4b16-a280-9589048721c1" containerName="keystone-api" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361610 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0474bea5-5db3-4b16-a280-9589048721c1" containerName="keystone-api" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361751 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-central-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361764 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="ceilometer-notification-agent" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361777 4744 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0474bea5-5db3-4b16-a280-9589048721c1" containerName="keystone-api" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361790 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="sg-core" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.361800 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2016fad5-7df3-474b-8322-7f8a81811556" containerName="proxy-httpd" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.364120 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.366329 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.366536 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.368476 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.384107 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456603 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456623 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456683 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data\") pod 
\"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456714 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.456772 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkslg\" (UniqueName: \"kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558426 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558561 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qkslg\" (UniqueName: \"kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.558999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.559038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.562730 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.564405 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.565042 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.565633 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.569534 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.574593 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkslg\" (UniqueName: \"kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg\") pod \"ceilometer-0\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:25 crc kubenswrapper[4744]: I1205 20:37:25.679077 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:26 crc kubenswrapper[4744]: I1205 20:37:26.089867 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2016fad5-7df3-474b-8322-7f8a81811556" path="/var/lib/kubelet/pods/2016fad5-7df3-474b-8322-7f8a81811556/volumes" Dec 05 20:37:26 crc kubenswrapper[4744]: I1205 20:37:26.201779 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:37:27 crc kubenswrapper[4744]: I1205 20:37:27.021929 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerStarted","Data":"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2"} Dec 05 20:37:27 crc kubenswrapper[4744]: I1205 20:37:27.021981 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerStarted","Data":"7cf2a579c7ff0875faeecfb51381d003a89f448293284ad2a3b8fc1c748f9b8f"} Dec 05 20:37:28 crc kubenswrapper[4744]: I1205 20:37:28.030749 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerStarted","Data":"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a"} Dec 05 20:37:29 crc kubenswrapper[4744]: I1205 20:37:29.044903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerStarted","Data":"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4"} Dec 05 20:37:30 crc kubenswrapper[4744]: I1205 20:37:30.068238 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerStarted","Data":"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04"} Dec 05 20:37:30 crc kubenswrapper[4744]: I1205 20:37:30.069884 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:30 crc kubenswrapper[4744]: I1205 20:37:30.106990 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.9124251920000002 podStartE2EDuration="5.106976291s" podCreationTimestamp="2025-12-05 20:37:25 +0000 UTC" firstStartedPulling="2025-12-05 20:37:26.213926412 +0000 UTC m=+1616.443737780" lastFinishedPulling="2025-12-05 20:37:29.408477501 +0000 UTC m=+1619.638288879" observedRunningTime="2025-12-05 20:37:30.101836206 +0000 UTC m=+1620.331647574" watchObservedRunningTime="2025-12-05 20:37:30.106976291 +0000 UTC m=+1620.336787659" Dec 05 20:37:35 crc kubenswrapper[4744]: E1205 20:37:35.232614 4744 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.51:35718->38.102.83.51:42651: read tcp 38.102.83.51:35718->38.102.83.51:42651: read: connection reset by peer Dec 05 20:37:49 crc kubenswrapper[4744]: I1205 20:37:49.806718 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:37:49 crc kubenswrapper[4744]: I1205 20:37:49.807182 4744 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:37:55 crc kubenswrapper[4744]: I1205 20:37:55.696780 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.343140 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-md64s"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.349461 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-md64s"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.407803 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher7cee-account-delete-wtflt"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.408778 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.430726 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher7cee-account-delete-wtflt"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.439792 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.440025 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerName="watcher-applier" containerID="cri-o://6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" gracePeriod=30 Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.476652 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.476898 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-kuttl-api-log" containerID="cri-o://c250032ef55cc455ef8aeca089af586a7c01e7a787597c6053aa1a85cd02d841" gracePeriod=30 Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.477030 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-api" containerID="cri-o://c418d9e285b4f0257143aa336232c4f3f6c49fd91d401fe502cdcf154341c447" gracePeriod=30 Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.501090 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv9lc\" (UniqueName: \"kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc\") pod \"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.501249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts\") pod 
\"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.567430 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.567670 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="4978e864-b896-4580-a9cf-796c8d465b8a" containerName="watcher-decision-engine" containerID="cri-o://b5417a2f9d6daa4ff819c2650d9c6bee168a458c4c0ed892c21a7cd9ad34871e" gracePeriod=30 Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.602902 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv9lc\" (UniqueName: \"kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc\") pod \"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.602990 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts\") pod \"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.604525 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts\") pod \"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.637919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv9lc\" (UniqueName: \"kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc\") pod \"watcher7cee-account-delete-wtflt\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.730482 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:56 crc kubenswrapper[4744]: E1205 20:37:56.818185 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:37:56 crc kubenswrapper[4744]: E1205 20:37:56.820384 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:37:56 crc kubenswrapper[4744]: E1205 20:37:56.825342 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:37:56 crc kubenswrapper[4744]: E1205 20:37:56.825391 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerName="watcher-applier" Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.826513 4744 generic.go:334] "Generic (PLEG): container finished" podID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerID="c250032ef55cc455ef8aeca089af586a7c01e7a787597c6053aa1a85cd02d841" exitCode=143 Dec 05 20:37:56 crc kubenswrapper[4744]: I1205 20:37:56.826553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerDied","Data":"c250032ef55cc455ef8aeca089af586a7c01e7a787597c6053aa1a85cd02d841"} Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.274453 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher7cee-account-delete-wtflt"] Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.837010 4744 generic.go:334] "Generic (PLEG): container finished" podID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerID="c418d9e285b4f0257143aa336232c4f3f6c49fd91d401fe502cdcf154341c447" exitCode=0 Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.837376 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerDied","Data":"c418d9e285b4f0257143aa336232c4f3f6c49fd91d401fe502cdcf154341c447"} Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.839097 4744 generic.go:334] "Generic (PLEG): container finished" podID="d8c937e6-61c3-49e3-8350-77193a48de84" containerID="d6af28c30fd182c01e93121217eb4289e991e04ea88e4ebc54e24ac262b456ec" exitCode=0 Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.839137 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" 
event={"ID":"d8c937e6-61c3-49e3-8350-77193a48de84","Type":"ContainerDied","Data":"d6af28c30fd182c01e93121217eb4289e991e04ea88e4ebc54e24ac262b456ec"} Dec 05 20:37:57 crc kubenswrapper[4744]: I1205 20:37:57.839192 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" event={"ID":"d8c937e6-61c3-49e3-8350-77193a48de84","Type":"ContainerStarted","Data":"5a6ba0f6f31fee383f9ed7258ce9ae7f488cb4af2974f987f339761fc9266940"} Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.091854 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c90c470-2c0c-42bb-8aaa-2716399201bf" path="/var/lib/kubelet/pods/9c90c470-2c0c-42bb-8aaa-2716399201bf/volumes" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.319501 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.438805 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.438887 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.438943 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.438974 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.439071 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.439104 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls\") pod \"a4732639-fc4c-4637-858b-c343ddcfa41e\" (UID: \"a4732639-fc4c-4637-858b-c343ddcfa41e\") " Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.444645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs" (OuterVolumeSpecName: "logs") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.446404 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49" (OuterVolumeSpecName: "kube-api-access-xps49") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). InnerVolumeSpecName "kube-api-access-xps49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.467660 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.476163 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.513398 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data" (OuterVolumeSpecName: "config-data") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.540684 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4732639-fc4c-4637-858b-c343ddcfa41e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.540723 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.540737 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.540748 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.540760 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/a4732639-fc4c-4637-858b-c343ddcfa41e-kube-api-access-xps49\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.570466 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "a4732639-fc4c-4637-858b-c343ddcfa41e" (UID: "a4732639-fc4c-4637-858b-c343ddcfa41e"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.643143 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a4732639-fc4c-4637-858b-c343ddcfa41e-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.848013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a4732639-fc4c-4637-858b-c343ddcfa41e","Type":"ContainerDied","Data":"74eda62d14c7d438bcfde80a14e75816b9c538768e17c09d208b2c2ac0bc90ed"} Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.848058 4744 scope.go:117] "RemoveContainer" containerID="c418d9e285b4f0257143aa336232c4f3f6c49fd91d401fe502cdcf154341c447" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.848152 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.853716 4744 generic.go:334] "Generic (PLEG): container finished" podID="4978e864-b896-4580-a9cf-796c8d465b8a" containerID="b5417a2f9d6daa4ff819c2650d9c6bee168a458c4c0ed892c21a7cd9ad34871e" exitCode=0 Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.853767 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"4978e864-b896-4580-a9cf-796c8d465b8a","Type":"ContainerDied","Data":"b5417a2f9d6daa4ff819c2650d9c6bee168a458c4c0ed892c21a7cd9ad34871e"} Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.853798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"4978e864-b896-4580-a9cf-796c8d465b8a","Type":"ContainerDied","Data":"3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c"} Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.853810 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e969e08d8aef610d8aa63c179413e1044d12985e6e6b5541434de068fed294c" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.893443 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.895017 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.895572 4744 scope.go:117] "RemoveContainer" containerID="c250032ef55cc455ef8aeca089af586a7c01e7a787597c6053aa1a85cd02d841" Dec 05 20:37:58 crc kubenswrapper[4744]: I1205 20:37:58.900994 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060460 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zqld\" (UniqueName: \"kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060513 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060568 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060688 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.060722 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data\") pod \"4978e864-b896-4580-a9cf-796c8d465b8a\" (UID: \"4978e864-b896-4580-a9cf-796c8d465b8a\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.063409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld" (OuterVolumeSpecName: "kube-api-access-2zqld") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "kube-api-access-2zqld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.063965 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs" (OuterVolumeSpecName: "logs") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.085044 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.086475 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.148336 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data" (OuterVolumeSpecName: "config-data") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.149057 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "4978e864-b896-4580-a9cf-796c8d465b8a" (UID: "4978e864-b896-4580-a9cf-796c8d465b8a"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162072 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162100 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162108 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4978e864-b896-4580-a9cf-796c8d465b8a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162117 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162126 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zqld\" (UniqueName: \"kubernetes.io/projected/4978e864-b896-4580-a9cf-796c8d465b8a-kube-api-access-2zqld\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.162136 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4978e864-b896-4580-a9cf-796c8d465b8a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.256398 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.364280 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts\") pod \"d8c937e6-61c3-49e3-8350-77193a48de84\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.364389 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv9lc\" (UniqueName: \"kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc\") pod \"d8c937e6-61c3-49e3-8350-77193a48de84\" (UID: \"d8c937e6-61c3-49e3-8350-77193a48de84\") " Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.364918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8c937e6-61c3-49e3-8350-77193a48de84" (UID: "d8c937e6-61c3-49e3-8350-77193a48de84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.367883 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc" (OuterVolumeSpecName: "kube-api-access-nv9lc") pod "d8c937e6-61c3-49e3-8350-77193a48de84" (UID: "d8c937e6-61c3-49e3-8350-77193a48de84"). InnerVolumeSpecName "kube-api-access-nv9lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.466574 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c937e6-61c3-49e3-8350-77193a48de84-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.466615 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv9lc\" (UniqueName: \"kubernetes.io/projected/d8c937e6-61c3-49e3-8350-77193a48de84-kube-api-access-nv9lc\") on node \"crc\" DevicePath \"\"" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.865161 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" event={"ID":"d8c937e6-61c3-49e3-8350-77193a48de84","Type":"ContainerDied","Data":"5a6ba0f6f31fee383f9ed7258ce9ae7f488cb4af2974f987f339761fc9266940"} Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.865188 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher7cee-account-delete-wtflt" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.865196 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6ba0f6f31fee383f9ed7258ce9ae7f488cb4af2974f987f339761fc9266940" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.865201 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.904517 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:37:59 crc kubenswrapper[4744]: I1205 20:37:59.912040 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.090125 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4978e864-b896-4580-a9cf-796c8d465b8a" path="/var/lib/kubelet/pods/4978e864-b896-4580-a9cf-796c8d465b8a/volumes" Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.090749 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" path="/var/lib/kubelet/pods/a4732639-fc4c-4637-858b-c343ddcfa41e/volumes" Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.608825 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.610017 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-central-agent" containerID="cri-o://847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2" gracePeriod=30 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.610054 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="sg-core" containerID="cri-o://e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4" gracePeriod=30 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.610054 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" 
containerName="proxy-httpd" containerID="cri-o://b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04" gracePeriod=30 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.610135 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-notification-agent" containerID="cri-o://75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a" gracePeriod=30 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.874647 4744 generic.go:334] "Generic (PLEG): container finished" podID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerID="6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" exitCode=0 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.874711 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c02b7d0f-79b3-4d57-8599-b0a077a1747f","Type":"ContainerDied","Data":"6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5"} Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.876570 4744 generic.go:334] "Generic (PLEG): container finished" podID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerID="b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04" exitCode=0 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.876593 4744 generic.go:334] "Generic (PLEG): container finished" podID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerID="e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4" exitCode=2 Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.876610 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerDied","Data":"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04"} Dec 05 20:38:00 crc kubenswrapper[4744]: I1205 20:38:00.876627 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerDied","Data":"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4"} Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.136020 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296267 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data\") pod \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296401 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs\") pod \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296493 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rspqh\" (UniqueName: \"kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh\") pod \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296521 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls\") pod \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle\") pod \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\" (UID: \"c02b7d0f-79b3-4d57-8599-b0a077a1747f\") " Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296778 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs" (OuterVolumeSpecName: "logs") pod "c02b7d0f-79b3-4d57-8599-b0a077a1747f" (UID: "c02b7d0f-79b3-4d57-8599-b0a077a1747f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.296857 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c02b7d0f-79b3-4d57-8599-b0a077a1747f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.301614 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh" (OuterVolumeSpecName: "kube-api-access-rspqh") pod "c02b7d0f-79b3-4d57-8599-b0a077a1747f" (UID: "c02b7d0f-79b3-4d57-8599-b0a077a1747f"). InnerVolumeSpecName "kube-api-access-rspqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.325819 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c02b7d0f-79b3-4d57-8599-b0a077a1747f" (UID: "c02b7d0f-79b3-4d57-8599-b0a077a1747f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.344228 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data" (OuterVolumeSpecName: "config-data") pod "c02b7d0f-79b3-4d57-8599-b0a077a1747f" (UID: "c02b7d0f-79b3-4d57-8599-b0a077a1747f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.357228 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "c02b7d0f-79b3-4d57-8599-b0a077a1747f" (UID: "c02b7d0f-79b3-4d57-8599-b0a077a1747f"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.398564 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.398608 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.398622 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rspqh\" (UniqueName: \"kubernetes.io/projected/c02b7d0f-79b3-4d57-8599-b0a077a1747f-kube-api-access-rspqh\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.398635 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c02b7d0f-79b3-4d57-8599-b0a077a1747f-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.432003 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cpjvn"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.441545 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cpjvn"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.450588 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.466749 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher7cee-account-delete-wtflt"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.473688 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-7cee-account-create-update-hdb9m"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.480508 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher7cee-account-delete-wtflt"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.886629 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c02b7d0f-79b3-4d57-8599-b0a077a1747f","Type":"ContainerDied","Data":"d7429fa526fb56ccc4c5cb056c08c22cf979b6c31b95295d12b8e60d8d70cd15"} Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.886639 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.886688 4744 scope.go:117] "RemoveContainer" containerID="6f46e9b143e77a5a322e6a7547ecb495adf5875ee3c3fefaf8c84b30e61e1ff5" Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.889869 4744 generic.go:334] "Generic (PLEG): container finished" podID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerID="847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2" exitCode=0 Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.890085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerDied","Data":"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2"} Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.928227 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:38:01 crc kubenswrapper[4744]: I1205 20:38:01.933610 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:38:02 crc kubenswrapper[4744]: I1205 20:38:02.091101 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda" path="/var/lib/kubelet/pods/0fcd02e8-8181-4dd1-aa8e-eb50ed18aeda/volumes" Dec 05 20:38:02 crc kubenswrapper[4744]: I1205 20:38:02.091621 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9629e785-9003-4e51-9e0d-3081a14b6003" path="/var/lib/kubelet/pods/9629e785-9003-4e51-9e0d-3081a14b6003/volumes" Dec 05 20:38:02 crc kubenswrapper[4744]: I1205 20:38:02.092105 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" path="/var/lib/kubelet/pods/c02b7d0f-79b3-4d57-8599-b0a077a1747f/volumes" Dec 05 20:38:02 crc kubenswrapper[4744]: I1205 20:38:02.093022 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c937e6-61c3-49e3-8350-77193a48de84" path="/var/lib/kubelet/pods/d8c937e6-61c3-49e3-8350-77193a48de84/volumes" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.127886 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-876kl"] Dec 05 20:38:03 crc kubenswrapper[4744]: E1205 20:38:03.128530 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4978e864-b896-4580-a9cf-796c8d465b8a" containerName="watcher-decision-engine" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.128545 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4978e864-b896-4580-a9cf-796c8d465b8a" containerName="watcher-decision-engine" Dec 05 20:38:03 crc kubenswrapper[4744]: E1205 20:38:03.128574 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c937e6-61c3-49e3-8350-77193a48de84" containerName="mariadb-account-delete" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.128580 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c937e6-61c3-49e3-8350-77193a48de84" containerName="mariadb-account-delete" Dec 05 20:38:03 crc kubenswrapper[4744]: E1205 20:38:03.128597 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-api" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.128607 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-api" Dec 05 
20:38:03 crc kubenswrapper[4744]: E1205 20:38:03.128620 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerName="watcher-applier" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.128628 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerName="watcher-applier" Dec 05 20:38:03 crc kubenswrapper[4744]: E1205 20:38:03.128638 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-kuttl-api-log" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.128645 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-kuttl-api-log" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.129121 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c02b7d0f-79b3-4d57-8599-b0a077a1747f" containerName="watcher-applier" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.129143 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-api" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.129157 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4732639-fc4c-4637-858b-c343ddcfa41e" containerName="watcher-kuttl-api-log" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.129177 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4978e864-b896-4580-a9cf-796c8d465b8a" containerName="watcher-decision-engine" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.129192 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c937e6-61c3-49e3-8350-77193a48de84" containerName="mariadb-account-delete" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.130617 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.157618 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5"] Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.158727 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.161285 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.165673 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-876kl"] Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.178416 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5"] Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.226583 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.226634 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d6f\" (UniqueName: \"kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.328669 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqrwz\" (UniqueName: \"kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.328783 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.328810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d6f\" (UniqueName: \"kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.328825 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.329450 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " 
pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.351433 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d6f\" (UniqueName: \"kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f\") pod \"watcher-db-create-876kl\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.430462 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqrwz\" (UniqueName: \"kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.430941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.431632 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.448728 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqrwz\" (UniqueName: \"kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz\") pod \"watcher-58b3-account-create-update-6dpz5\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.459718 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.499894 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:03 crc kubenswrapper[4744]: I1205 20:38:03.933920 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-876kl"] Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.128623 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5"] Dec 05 20:38:04 crc kubenswrapper[4744]: W1205 20:38:04.136982 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3ec182_e3b5_464b_b973_e2599aff944c.slice/crio-0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f WatchSource:0}: Error finding container 0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f: Status 404 returned error can't find the container with id 0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.741972 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.859776 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkslg\" (UniqueName: \"kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.859878 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.859909 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.859948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.859977 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.860066 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.860126 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.860178 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd\") pod \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\" (UID: \"da229e28-5d19-4f4b-afab-3fd7a27ea9b1\") " Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.860867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.861439 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.873216 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg" (OuterVolumeSpecName: "kube-api-access-qkslg") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "kube-api-access-qkslg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.882426 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts" (OuterVolumeSpecName: "scripts") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.890380 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.917355 4744 generic.go:334] "Generic (PLEG): container finished" podID="e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" containerID="f7f0337b8e3263e4aac2458de206366665aea3ca88a1b92883c2f0a74b496d99" exitCode=0 Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.917412 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-876kl" event={"ID":"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4","Type":"ContainerDied","Data":"f7f0337b8e3263e4aac2458de206366665aea3ca88a1b92883c2f0a74b496d99"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.917439 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-876kl" event={"ID":"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4","Type":"ContainerStarted","Data":"0dacd2510285ea07fe011b9056aec30fadaa5fcbc1b9fcba48a43e566560a625"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.933612 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.941554 4744 generic.go:334] "Generic (PLEG): container finished" podID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerID="75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a" exitCode=0 Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.941657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerDied","Data":"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.941692 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"da229e28-5d19-4f4b-afab-3fd7a27ea9b1","Type":"ContainerDied","Data":"7cf2a579c7ff0875faeecfb51381d003a89f448293284ad2a3b8fc1c748f9b8f"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.941695 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.941709 4744 scope.go:117] "RemoveContainer" containerID="b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.945037 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.949038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" event={"ID":"ea3ec182-e3b5-464b-b973-e2599aff944c","Type":"ContainerStarted","Data":"f39cdcde712ce7599672b1d10ffc0ef29b126b35da5654e8619e591f3bd328b5"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.949095 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" event={"ID":"ea3ec182-e3b5-464b-b973-e2599aff944c","Type":"ContainerStarted","Data":"0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f"} Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961782 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961805 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961814 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961843 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkslg\" (UniqueName: \"kubernetes.io/projected/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-kube-api-access-qkslg\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961851 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961859 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.961866 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.971044 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" podStartSLOduration=1.9710226450000001 podStartE2EDuration="1.971022645s" podCreationTimestamp="2025-12-05 20:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:04.966308919 +0000 UTC m=+1655.196120287" watchObservedRunningTime="2025-12-05 20:38:04.971022645 +0000 UTC m=+1655.200834013" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.975158 4744 scope.go:117] "RemoveContainer" containerID="e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4" Dec 05 20:38:04 crc kubenswrapper[4744]: I1205 20:38:04.985951 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data" (OuterVolumeSpecName: "config-data") pod "da229e28-5d19-4f4b-afab-3fd7a27ea9b1" (UID: "da229e28-5d19-4f4b-afab-3fd7a27ea9b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.004097 4744 scope.go:117] "RemoveContainer" containerID="75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.033229 4744 scope.go:117] "RemoveContainer" containerID="847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.063529 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da229e28-5d19-4f4b-afab-3fd7a27ea9b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.087230 4744 scope.go:117] "RemoveContainer" containerID="b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.087835 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04\": container with ID starting with b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04 not found: ID does not exist" containerID="b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.087884 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04"} err="failed to get container status \"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04\": rpc error: code = NotFound desc = could not find container \"b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04\": container with ID starting with b09e507612c6998e0072b8f559e69066a5f6be885be20900e3bf86653aa11e04 not found: ID does not exist" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.087922 4744 scope.go:117] "RemoveContainer" containerID="e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.088400 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4\": container with ID starting with e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4 not found: ID does not exist" containerID="e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.088436 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4"} err="failed to get container status \"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4\": rpc error: code = NotFound desc = could not find container \"e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4\": container with ID starting with e8090a3f9caf693275b07f7c7f3c93bbd3f826265deceea34424a1d3b92f80d4 not found: ID does not exist" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.088463 4744 scope.go:117] "RemoveContainer" 
containerID="75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.088746 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a\": container with ID starting with 75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a not found: ID does not exist" containerID="75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.088774 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a"} err="failed to get container status \"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a\": rpc error: code = NotFound desc = could not find container \"75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a\": container with ID starting with 75ab16526fd07710ff9155d2fae7387ee86298df5b61fc95b78298852500770a not found: ID does not exist" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.088793 4744 scope.go:117] "RemoveContainer" containerID="847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.089022 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2\": container with ID starting with 847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2 not found: ID does not exist" containerID="847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.089048 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2"} err="failed to get container status \"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2\": rpc error: code = NotFound desc = could not find container \"847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2\": container with ID starting with 847d2d82802b98965065846dcd322715cb5a3d4195a42c59cd31636434890ea2 not found: ID does not exist" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.279339 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.286899 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.303672 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.304195 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-notification-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304310 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-notification-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.304397 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="sg-core" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304458 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="sg-core" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.304521 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="proxy-httpd" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304579 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="proxy-httpd" Dec 05 20:38:05 crc kubenswrapper[4744]: E1205 20:38:05.304650 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-central-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304702 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-central-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304914 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-central-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.304996 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="ceilometer-notification-agent" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.305063 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="proxy-httpd" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.305129 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" containerName="sg-core"
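
Note: the two runs of entries above are the cleanup half of replacing ceilometer-0. The RemoveContainer / "ContainerStatus from runtime service failed" NotFound pairs show the kubelet confirming that CRI-O has already dropped the old pod's containers (UID da229e28-...), and the cpu_manager / memory_manager "RemoveStaleState" lines clear per-container resource-allocation state so nothing pinned to the old UID survives. The E-level NotFound entries are benign: deletion is idempotent, and "already gone" is the desired end state. A minimal sketch of that treat-NotFound-as-success pattern, using toy types rather than the real kubelet/CRI interfaces:

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI "NotFound" status seen in the log.
var errNotFound = errors.New("NotFound")

type toyRuntime struct{ containers map[string]bool }

func (r *toyRuntime) Remove(id string) error {
	if !r.containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(r.containers, id)
	return nil
}

// removeContainer treats NotFound as success: an already-absent container
// is the desired end state, so the error is logged and ignored.
func removeContainer(r *toyRuntime, id string) {
	if err := r.Remove(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("DeleteContainer returned error (benign): %v\n", err)
			return
		}
		fmt.Printf("delete failed: %v\n", err)
		return
	}
	fmt.Printf("removed %s\n", id)
}

func main() {
	r := &toyRuntime{containers: map[string]bool{"75ab16": true}}
	removeContainer(r, "75ab16") // removes it
	removeContainer(r, "75ab16") // NotFound: already gone, ignored
}
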
Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.307056 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.310914 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.311005 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.311249 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.318571 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.469949 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470234 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470402 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879tt\" (UniqueName: \"kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470527 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470599 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.470685 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879tt\" (UniqueName: \"kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572121 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572158 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572176 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.572371 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.575992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.576225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.576661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.576968 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.579006 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.579065 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.584183 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.587686 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879tt\" (UniqueName: \"kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt\") pod \"ceilometer-0\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.625531 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.625531 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.957920 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea3ec182-e3b5-464b-b973-e2599aff944c" containerID="f39cdcde712ce7599672b1d10ffc0ef29b126b35da5654e8619e591f3bd328b5" exitCode=0 Dec 05 20:38:05 crc kubenswrapper[4744]: I1205 20:38:05.957990 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" event={"ID":"ea3ec182-e3b5-464b-b973-e2599aff944c","Type":"ContainerDied","Data":"f39cdcde712ce7599672b1d10ffc0ef29b126b35da5654e8619e591f3bd328b5"} Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.074996 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.093806 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.114270 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da229e28-5d19-4f4b-afab-3fd7a27ea9b1" path="/var/lib/kubelet/pods/da229e28-5d19-4f4b-afab-3fd7a27ea9b1/volumes" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.259891 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.385325 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts\") pod \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.385398 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d6f\" (UniqueName: \"kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f\") pod \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\" (UID: \"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4\") " Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.386547 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" (UID: "e57f5b4f-fe7b-406e-ac8f-c934b2149ae4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.390311 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f" (OuterVolumeSpecName: "kube-api-access-l2d6f") pod "e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" (UID: "e57f5b4f-fe7b-406e-ac8f-c934b2149ae4"). InnerVolumeSpecName "kube-api-access-l2d6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.487588 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d6f\" (UniqueName: \"kubernetes.io/projected/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-kube-api-access-l2d6f\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.487886 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.967326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-876kl" event={"ID":"e57f5b4f-fe7b-406e-ac8f-c934b2149ae4","Type":"ContainerDied","Data":"0dacd2510285ea07fe011b9056aec30fadaa5fcbc1b9fcba48a43e566560a625"} Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.967603 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dacd2510285ea07fe011b9056aec30fadaa5fcbc1b9fcba48a43e566560a625" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.967647 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-876kl" Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.974865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerStarted","Data":"fbfc1e53669be73a0da327f7d69dbd5783cfc751de5d4fee06e32c0a7b44ceca"} Dec 05 20:38:06 crc kubenswrapper[4744]: I1205 20:38:06.974913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerStarted","Data":"c2e1d1372c3b6572abdcf262b3334d66809fc5230685d4fc7c2ca8f329127491"} Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.270642 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.416015 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqrwz\" (UniqueName: \"kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz\") pod \"ea3ec182-e3b5-464b-b973-e2599aff944c\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.416094 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts\") pod \"ea3ec182-e3b5-464b-b973-e2599aff944c\" (UID: \"ea3ec182-e3b5-464b-b973-e2599aff944c\") " Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.416945 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea3ec182-e3b5-464b-b973-e2599aff944c" (UID: "ea3ec182-e3b5-464b-b973-e2599aff944c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.422026 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz" (OuterVolumeSpecName: "kube-api-access-kqrwz") pod "ea3ec182-e3b5-464b-b973-e2599aff944c" (UID: "ea3ec182-e3b5-464b-b973-e2599aff944c"). InnerVolumeSpecName "kube-api-access-kqrwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.517875 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqrwz\" (UniqueName: \"kubernetes.io/projected/ea3ec182-e3b5-464b-b973-e2599aff944c-kube-api-access-kqrwz\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.517914 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3ec182-e3b5-464b-b973-e2599aff944c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.984697 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" event={"ID":"ea3ec182-e3b5-464b-b973-e2599aff944c","Type":"ContainerDied","Data":"0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f"} Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.985037 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0569d7d13e5bdf62f8766628065b053fb09fe367ed504d5ccfd4ba7d6d00b30f" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.984715 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5" Dec 05 20:38:07 crc kubenswrapper[4744]: I1205 20:38:07.986557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerStarted","Data":"ca4e53d9dcdaf7f152a6991d5ee747556ca7a1e66c15fe48cb5ac1999527ef07"} Dec 05 20:38:08 crc kubenswrapper[4744]: I1205 20:38:08.997654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerStarted","Data":"101021aea0b7d201da2eba15934e06f7a98a6d9c5c5d29ece5c696525da88a77"} Dec 05 20:38:10 crc kubenswrapper[4744]: I1205 20:38:10.016480 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerStarted","Data":"ace9d6d3047d0c9f8de74e1e02c756498ad63a7297d60895865fa2e272ba5b47"} Dec 05 20:38:10 crc kubenswrapper[4744]: I1205 20:38:10.016899 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:10 crc kubenswrapper[4744]: I1205 20:38:10.044063 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.951416971 podStartE2EDuration="5.04404467s" podCreationTimestamp="2025-12-05 20:38:05 +0000 UTC" firstStartedPulling="2025-12-05 20:38:06.09357197 +0000 UTC m=+1656.323383338" lastFinishedPulling="2025-12-05 20:38:09.186199659 +0000 UTC m=+1659.416011037" observedRunningTime="2025-12-05 20:38:10.04362235 +0000 UTC m=+1660.273433738" watchObservedRunningTime="2025-12-05 20:38:10.04404467 +0000 UTC 
m=+1660.273856038" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.398959 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm"] Dec 05 20:38:13 crc kubenswrapper[4744]: E1205 20:38:13.399591 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3ec182-e3b5-464b-b973-e2599aff944c" containerName="mariadb-account-create-update" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.399602 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3ec182-e3b5-464b-b973-e2599aff944c" containerName="mariadb-account-create-update" Dec 05 20:38:13 crc kubenswrapper[4744]: E1205 20:38:13.399615 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" containerName="mariadb-database-create" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.399621 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" containerName="mariadb-database-create" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.399779 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3ec182-e3b5-464b-b973-e2599aff944c" containerName="mariadb-account-create-update" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.399792 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" containerName="mariadb-database-create" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.400275 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.404185 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.404185 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-w6j2f" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.416700 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm"] Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.523263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwgq\" (UniqueName: \"kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.523320 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.523343 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.523694 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.630388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.630588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwgq\" (UniqueName: \"kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.630643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.630676 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.636014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.642853 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.658054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcwgq\" (UniqueName: \"kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.664938 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data\") pod \"watcher-kuttl-db-sync-zp8zm\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 
05 20:38:13 crc kubenswrapper[4744]: I1205 20:38:13.722710 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:14 crc kubenswrapper[4744]: I1205 20:38:14.278472 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm"] Dec 05 20:38:15 crc kubenswrapper[4744]: I1205 20:38:15.059150 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" event={"ID":"3c1fb22d-2926-4e72-946d-164071db6f9a","Type":"ContainerStarted","Data":"61d8e1ee54fa5ae2bb38cedfe1e6cd1c7ec76f3bd9a9db2c2ee2a46e4f6eacc8"} Dec 05 20:38:15 crc kubenswrapper[4744]: I1205 20:38:15.059418 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" event={"ID":"3c1fb22d-2926-4e72-946d-164071db6f9a","Type":"ContainerStarted","Data":"635cad6f72834f30a842f73b1872a478e5c19b1f416b0c0a20bb7e2f8e4b2e6c"} Dec 05 20:38:15 crc kubenswrapper[4744]: I1205 20:38:15.085718 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" podStartSLOduration=2.0856964749999998 podStartE2EDuration="2.085696475s" podCreationTimestamp="2025-12-05 20:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:15.077039842 +0000 UTC m=+1665.306851210" watchObservedRunningTime="2025-12-05 20:38:15.085696475 +0000 UTC m=+1665.315507853" Dec 05 20:38:17 crc kubenswrapper[4744]: I1205 20:38:17.084147 4744 generic.go:334] "Generic (PLEG): container finished" podID="3c1fb22d-2926-4e72-946d-164071db6f9a" containerID="61d8e1ee54fa5ae2bb38cedfe1e6cd1c7ec76f3bd9a9db2c2ee2a46e4f6eacc8" exitCode=0 Dec 05 20:38:17 crc kubenswrapper[4744]: I1205 20:38:17.084332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" event={"ID":"3c1fb22d-2926-4e72-946d-164071db6f9a","Type":"ContainerDied","Data":"61d8e1ee54fa5ae2bb38cedfe1e6cd1c7ec76f3bd9a9db2c2ee2a46e4f6eacc8"}
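
Note: in the latency entry above, firstStartedPulling and lastFinishedPulling are Go's zero time (0001-01-01 00:00:00 +0000 UTC), which indicates no image pull happened for watcher-kuttl-db-sync-zp8zm, so podStartSLOduration equals podStartE2EDuration; the 2.0856964749999998 spelling is ordinary float64 formatting of 2.085696475 s. The container exiting with exitCode=0 at 20:38:17 is the db-sync job completing normally. A sketch of that zero-timestamp guard, assuming this is the intended semantics of the zero values:

package main

import (
	"fmt"
	"time"
)

// sloDuration sketches the assumed semantics: a zero pull timestamp means
// no image pull happened, so nothing is subtracted from the E2E duration.
func sloDuration(e2e time.Duration, firstStarted, lastFinished time.Time) time.Duration {
	if firstStarted.IsZero() || lastFinished.IsZero() {
		return e2e
	}
	return e2e - lastFinished.Sub(firstStarted)
}

func main() {
	e2e := 2085696475 * time.Nanosecond // 2.085696475s, from the entry above
	fmt.Println(sloDuration(e2e, time.Time{}, time.Time{})) // prints 2.085696475s
}
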
Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.538973 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.714366 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data\") pod \"3c1fb22d-2926-4e72-946d-164071db6f9a\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.714567 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data\") pod \"3c1fb22d-2926-4e72-946d-164071db6f9a\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.714646 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle\") pod \"3c1fb22d-2926-4e72-946d-164071db6f9a\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.714694 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcwgq\" (UniqueName: \"kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq\") pod \"3c1fb22d-2926-4e72-946d-164071db6f9a\" (UID: \"3c1fb22d-2926-4e72-946d-164071db6f9a\") " Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.720321 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3c1fb22d-2926-4e72-946d-164071db6f9a" (UID: "3c1fb22d-2926-4e72-946d-164071db6f9a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.720830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq" (OuterVolumeSpecName: "kube-api-access-jcwgq") pod "3c1fb22d-2926-4e72-946d-164071db6f9a" (UID: "3c1fb22d-2926-4e72-946d-164071db6f9a"). InnerVolumeSpecName "kube-api-access-jcwgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.738322 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c1fb22d-2926-4e72-946d-164071db6f9a" (UID: "3c1fb22d-2926-4e72-946d-164071db6f9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.772196 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data" (OuterVolumeSpecName: "config-data") pod "3c1fb22d-2926-4e72-946d-164071db6f9a" (UID: "3c1fb22d-2926-4e72-946d-164071db6f9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.816371 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.816403 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.816440 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcwgq\" (UniqueName: \"kubernetes.io/projected/3c1fb22d-2926-4e72-946d-164071db6f9a-kube-api-access-jcwgq\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:18 crc kubenswrapper[4744]: I1205 20:38:18.816454 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1fb22d-2926-4e72-946d-164071db6f9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.118794 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" event={"ID":"3c1fb22d-2926-4e72-946d-164071db6f9a","Type":"ContainerDied","Data":"635cad6f72834f30a842f73b1872a478e5c19b1f416b0c0a20bb7e2f8e4b2e6c"} Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.119159 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635cad6f72834f30a842f73b1872a478e5c19b1f416b0c0a20bb7e2f8e4b2e6c" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.118918 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.411801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: E1205 20:38:19.412380 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1fb22d-2926-4e72-946d-164071db6f9a" containerName="watcher-kuttl-db-sync" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.412410 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1fb22d-2926-4e72-946d-164071db6f9a" containerName="watcher-kuttl-db-sync" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.412686 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1fb22d-2926-4e72-946d-164071db6f9a" containerName="watcher-kuttl-db-sync" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.414346 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.422467 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.423760 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.423760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.427728 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-w6j2f" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.427976 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.428562 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.440744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.453562 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.467448 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.469183 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.477247 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.523062 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.524222 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.526423 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528374 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528568 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528594 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9z4w\" (UniqueName: \"kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528649 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc 
kubenswrapper[4744]: I1205 20:38:19.528681 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528695 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.528717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwjs\" (UniqueName: \"kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.543599 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629569 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629610 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629628 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629686 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mmrzn\" (UniqueName: \"kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629731 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwjs\" (UniqueName: \"kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629744 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629764 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629809 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629827 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629858 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: 
I1205 20:38:19.629872 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629902 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629922 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629945 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9z4w\" (UniqueName: \"kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629959 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629977 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.629992 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj94t\" (UniqueName: \"kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.630010 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.630027 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc 
kubenswrapper[4744]: I1205 20:38:19.631081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.631397 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.634696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.635234 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.636074 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.636362 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.637453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.644946 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.645001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.650460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwjs\" (UniqueName: 
\"kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs\") pod \"watcher-kuttl-applier-0\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.662834 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9z4w\" (UniqueName: \"kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w\") pod \"watcher-kuttl-api-0\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.732019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.732540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.732805 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj94t\" (UniqueName: \"kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.732984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.734623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.735374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.736423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.737404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.738751 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrzn\" (UniqueName: \"kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.738921 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.739106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.739356 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.739582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.739761 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.739858 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.740916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.743253 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.743724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.746106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.746338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.747752 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.748856 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.761167 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.763204 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj94t\" (UniqueName: \"kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.764125 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.765881 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrzn\" (UniqueName: \"kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.787586 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.806135 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.806173 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:38:19 crc kubenswrapper[4744]: I1205 20:38:19.870866 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:20 crc kubenswrapper[4744]: I1205 20:38:20.314139 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:38:20 crc kubenswrapper[4744]: I1205 20:38:20.389681 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:38:20 crc kubenswrapper[4744]: W1205 20:38:20.391235 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309dd5ae_4de9_4ec1_9dc5_8d5c2cfd1a52.slice/crio-9217686e4d37b2bd01d9511ff923355820b6b06abcbe3158c6545e0e6f0d003a WatchSource:0}: Error finding container 9217686e4d37b2bd01d9511ff923355820b6b06abcbe3158c6545e0e6f0d003a: Status 404 returned error can't find the container with id 9217686e4d37b2bd01d9511ff923355820b6b06abcbe3158c6545e0e6f0d003a Dec 05 20:38:20 crc kubenswrapper[4744]: I1205 20:38:20.400723 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:38:20 crc kubenswrapper[4744]: I1205 20:38:20.478375 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.145061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerStarted","Data":"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.145415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerStarted","Data":"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.145425 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerStarted","Data":"40cedc4883e653195f4070fd46f6f24909ef2f4ec0e92453b5b271b4356e2003"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.147377 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.152129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52","Type":"ContainerStarted","Data":"0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.152173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52","Type":"ContainerStarted","Data":"9217686e4d37b2bd01d9511ff923355820b6b06abcbe3158c6545e0e6f0d003a"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.154931 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8917b45b-f26b-459d-b5a3-a37c6237f112","Type":"ContainerStarted","Data":"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.154967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8917b45b-f26b-459d-b5a3-a37c6237f112","Type":"ContainerStarted","Data":"fc8a0a97c603689ae23587d6604b8c1ac1a0c520281cfb7a770bb2a8b9f943e7"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.160113 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerStarted","Data":"954d6cee9a7894117b849451cb7da52d12047e8b6d5f7d979fc6675a7b6f6670"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.160145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerStarted","Data":"d47ddcbdb724dfc894a20117962245e1ce9dd721921887429c95a481efd1e07a"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.160154 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerStarted","Data":"bd5c20e46a20a72c1260cef921f23466506f0c74a85f4023beea3b4bcd010361"} Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.160875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.167376 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.167360189 podStartE2EDuration="2.167360189s" podCreationTimestamp="2025-12-05 20:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:21.162135681 +0000 UTC m=+1671.391947059" watchObservedRunningTime="2025-12-05 20:38:21.167360189 +0000 UTC m=+1671.397171557" Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.185163 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.185142765 podStartE2EDuration="2.185142765s" podCreationTimestamp="2025-12-05 20:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:21.17877087 +0000 UTC m=+1671.408582238" watchObservedRunningTime="2025-12-05 20:38:21.185142765 +0000 UTC m=+1671.414954133" Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.208130 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.208112739 podStartE2EDuration="2.208112739s" podCreationTimestamp="2025-12-05 20:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:21.204192693 +0000 UTC m=+1671.434004061" watchObservedRunningTime="2025-12-05 20:38:21.208112739 +0000 UTC m=+1671.437924107" Dec 05 20:38:21 crc kubenswrapper[4744]: I1205 20:38:21.227912 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.227894405 podStartE2EDuration="2.227894405s" podCreationTimestamp="2025-12-05 20:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:21.220998806 +0000 UTC m=+1671.450810174" watchObservedRunningTime="2025-12-05 20:38:21.227894405 +0000 UTC m=+1671.457705773" Dec 05 20:38:23 crc kubenswrapper[4744]: I1205 20:38:23.173944 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:38:23 crc kubenswrapper[4744]: I1205 20:38:23.173997 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 20:38:23 crc kubenswrapper[4744]: I1205 20:38:23.379269 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:23 crc kubenswrapper[4744]: I1205 20:38:23.732744 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:24 crc kubenswrapper[4744]: I1205 20:38:24.749330 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:24 crc kubenswrapper[4744]: I1205 20:38:24.761867 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:24 crc kubenswrapper[4744]: I1205 20:38:24.788073 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.750128 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.759234 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.762460 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.788757 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.797675 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.801442 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.872114 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 
20:38:29 crc kubenswrapper[4744]: I1205 20:38:29.893651 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:30 crc kubenswrapper[4744]: I1205 20:38:30.235519 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:30 crc kubenswrapper[4744]: I1205 20:38:30.242051 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:38:30 crc kubenswrapper[4744]: I1205 20:38:30.242356 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:38:30 crc kubenswrapper[4744]: I1205 20:38:30.262624 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:38:30 crc kubenswrapper[4744]: I1205 20:38:30.272862 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.446015 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.446629 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-central-agent" containerID="cri-o://fbfc1e53669be73a0da327f7d69dbd5783cfc751de5d4fee06e32c0a7b44ceca" gracePeriod=30 Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.446684 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="sg-core" containerID="cri-o://101021aea0b7d201da2eba15934e06f7a98a6d9c5c5d29ece5c696525da88a77" gracePeriod=30 Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.446738 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="proxy-httpd" containerID="cri-o://ace9d6d3047d0c9f8de74e1e02c756498ad63a7297d60895865fa2e272ba5b47" gracePeriod=30 Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.447975 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-notification-agent" containerID="cri-o://ca4e53d9dcdaf7f152a6991d5ee747556ca7a1e66c15fe48cb5ac1999527ef07" gracePeriod=30 Dec 05 20:38:32 crc kubenswrapper[4744]: I1205 20:38:32.457788 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.178:3000/\": EOF" Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272730 4744 generic.go:334] "Generic (PLEG): container finished" podID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerID="ace9d6d3047d0c9f8de74e1e02c756498ad63a7297d60895865fa2e272ba5b47" exitCode=0 Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272762 4744 generic.go:334] "Generic (PLEG): container finished" podID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerID="101021aea0b7d201da2eba15934e06f7a98a6d9c5c5d29ece5c696525da88a77" 
exitCode=2 Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272769 4744 generic.go:334] "Generic (PLEG): container finished" podID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerID="fbfc1e53669be73a0da327f7d69dbd5783cfc751de5d4fee06e32c0a7b44ceca" exitCode=0 Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerDied","Data":"ace9d6d3047d0c9f8de74e1e02c756498ad63a7297d60895865fa2e272ba5b47"} Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerDied","Data":"101021aea0b7d201da2eba15934e06f7a98a6d9c5c5d29ece5c696525da88a77"} Dec 05 20:38:33 crc kubenswrapper[4744]: I1205 20:38:33.272826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerDied","Data":"fbfc1e53669be73a0da327f7d69dbd5783cfc751de5d4fee06e32c0a7b44ceca"} Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.285198 4744 generic.go:334] "Generic (PLEG): container finished" podID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerID="ca4e53d9dcdaf7f152a6991d5ee747556ca7a1e66c15fe48cb5ac1999527ef07" exitCode=0 Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.285512 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerDied","Data":"ca4e53d9dcdaf7f152a6991d5ee747556ca7a1e66c15fe48cb5ac1999527ef07"} Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.447585 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526764 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-879tt\" (UniqueName: \"kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526813 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526843 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526938 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526963 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.526996 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.527030 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.527060 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.528625 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.528944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.566227 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts" (OuterVolumeSpecName: "scripts") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.570560 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt" (OuterVolumeSpecName: "kube-api-access-879tt") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "kube-api-access-879tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.606532 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.628944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.629521 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") pod \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\" (UID: \"2ecbe775-7079-493d-93f5-b7f1e34e74a7\") " Dec 05 20:38:34 crc kubenswrapper[4744]: W1205 20:38:34.629653 4744 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2ecbe775-7079-493d-93f5-b7f1e34e74a7/volumes/kubernetes.io~secret/combined-ca-bundle Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.629675 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630004 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-879tt\" (UniqueName: \"kubernetes.io/projected/2ecbe775-7079-493d-93f5-b7f1e34e74a7-kube-api-access-879tt\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630033 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630053 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630070 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630088 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.630096 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ecbe775-7079-493d-93f5-b7f1e34e74a7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.635710 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.650592 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data" (OuterVolumeSpecName: "config-data") pod "2ecbe775-7079-493d-93f5-b7f1e34e74a7" (UID: "2ecbe775-7079-493d-93f5-b7f1e34e74a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.731657 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:34 crc kubenswrapper[4744]: I1205 20:38:34.731683 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecbe775-7079-493d-93f5-b7f1e34e74a7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.082976 4744 scope.go:117] "RemoveContainer" containerID="eedd081d5a2588b462850249437846282ad7b3ebf125c6a4ec001312d4153d7b" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.195339 4744 scope.go:117] "RemoveContainer" containerID="dfd8a7176d2d1bbc74e20bf321b3ff236d1d9b2f3769b9708022eabed666113c" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.296806 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2ecbe775-7079-493d-93f5-b7f1e34e74a7","Type":"ContainerDied","Data":"c2e1d1372c3b6572abdcf262b3334d66809fc5230685d4fc7c2ca8f329127491"} Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.296859 4744 scope.go:117] "RemoveContainer" containerID="ace9d6d3047d0c9f8de74e1e02c756498ad63a7297d60895865fa2e272ba5b47" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.296964 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.335096 4744 scope.go:117] "RemoveContainer" containerID="101021aea0b7d201da2eba15934e06f7a98a6d9c5c5d29ece5c696525da88a77" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.339670 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.360226 4744 scope.go:117] "RemoveContainer" containerID="ca4e53d9dcdaf7f152a6991d5ee747556ca7a1e66c15fe48cb5ac1999527ef07" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.360379 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.369428 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:35 crc kubenswrapper[4744]: E1205 20:38:35.369997 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="proxy-httpd" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370020 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="proxy-httpd" Dec 05 20:38:35 crc kubenswrapper[4744]: E1205 20:38:35.370035 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-central-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370047 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-central-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: E1205 20:38:35.370078 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="sg-core" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370088 4744 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="sg-core" Dec 05 20:38:35 crc kubenswrapper[4744]: E1205 20:38:35.370110 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-notification-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370119 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-notification-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370340 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-central-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370363 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="ceilometer-notification-agent" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370377 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="proxy-httpd" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.370403 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" containerName="sg-core" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.375954 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.380903 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.381086 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.382512 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.382662 4744 scope.go:117] "RemoveContainer" containerID="fbfc1e53669be73a0da327f7d69dbd5783cfc751de5d4fee06e32c0a7b44ceca" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.384501 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544121 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544162 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc 
kubenswrapper[4744]: I1205 20:38:35.544202 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544812 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.544943 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtkw\" (UniqueName: \"kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.545006 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.646930 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647024 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647118 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647214 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647561 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647636 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtkw\" (UniqueName: \"kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.647970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.648062 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.651233 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.652072 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.652359 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.652783 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.671607 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.674103 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtkw\" (UniqueName: \"kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw\") pod \"ceilometer-0\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:35 crc kubenswrapper[4744]: I1205 20:38:35.701199 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:36 crc kubenswrapper[4744]: I1205 20:38:36.090497 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ecbe775-7079-493d-93f5-b7f1e34e74a7" path="/var/lib/kubelet/pods/2ecbe775-7079-493d-93f5-b7f1e34e74a7/volumes" Dec 05 20:38:36 crc kubenswrapper[4744]: I1205 20:38:36.272393 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:38:36 crc kubenswrapper[4744]: I1205 20:38:36.310246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerStarted","Data":"fd2e2057e45f73ef25e4371994b30689f0faef8e87e524b697413cce06d87101"} Dec 05 20:38:37 crc kubenswrapper[4744]: I1205 20:38:37.320523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerStarted","Data":"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874"} Dec 05 20:38:38 crc kubenswrapper[4744]: I1205 20:38:38.335046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerStarted","Data":"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54"} Dec 05 20:38:38 crc kubenswrapper[4744]: I1205 20:38:38.335496 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerStarted","Data":"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744"} Dec 05 20:38:40 crc kubenswrapper[4744]: I1205 20:38:40.355403 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerStarted","Data":"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7"} Dec 05 20:38:40 crc kubenswrapper[4744]: I1205 20:38:40.355823 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:38:40 crc kubenswrapper[4744]: I1205 20:38:40.381369 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.277533191 podStartE2EDuration="5.381351658s" podCreationTimestamp="2025-12-05 20:38:35 +0000 UTC" firstStartedPulling="2025-12-05 20:38:36.280852597 +0000 UTC m=+1686.510663965" lastFinishedPulling="2025-12-05 20:38:39.384671064 +0000 UTC m=+1689.614482432" observedRunningTime="2025-12-05 20:38:40.374386093 +0000 UTC m=+1690.604197461" 
watchObservedRunningTime="2025-12-05 20:38:40.381351658 +0000 UTC m=+1690.611163026"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.531655 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.533694 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.536807 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ld9s\" (UniqueName: \"kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661453 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661821 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.661880 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.763832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.763943 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.763962 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.764009 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.764048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ld9s\" (UniqueName: \"kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.764070 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.764741 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.770177 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.770250 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.770401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.771735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.782875 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ld9s\" (UniqueName: \"kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s\") pod \"watcher-kuttl-api-2\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:41 crc kubenswrapper[4744]: I1205 20:38:41.861917 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:42 crc kubenswrapper[4744]: I1205 20:38:42.354330 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:42 crc kubenswrapper[4744]: I1205 20:38:42.381685 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerStarted","Data":"0adc6b1534a67394afd1ce170b1b58b253f08917ccd3ad8bcc1500c653de9b58"}
Dec 05 20:38:43 crc kubenswrapper[4744]: I1205 20:38:43.390985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerStarted","Data":"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"}
Dec 05 20:38:43 crc kubenswrapper[4744]: I1205 20:38:43.392324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerStarted","Data":"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"}
Dec 05 20:38:43 crc kubenswrapper[4744]: I1205 20:38:43.394347 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:43 crc kubenswrapper[4744]: I1205 20:38:43.429974 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-2" podStartSLOduration=2.42995501 podStartE2EDuration="2.42995501s" podCreationTimestamp="2025-12-05 20:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:38:43.421160334 +0000 UTC m=+1693.650971712" watchObservedRunningTime="2025-12-05 20:38:43.42995501 +0000 UTC m=+1693.659766378"
Dec 05 20:38:45 crc kubenswrapper[4744]: I1205 20:38:45.415304 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 20:38:45 crc kubenswrapper[4744]: I1205 20:38:45.666719 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:46 crc kubenswrapper[4744]: I1205 20:38:46.862982 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:49 crc kubenswrapper[4744]: I1205 20:38:49.806746 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:38:49 crc kubenswrapper[4744]: I1205 20:38:49.807493 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:38:49 crc kubenswrapper[4744]: I1205 20:38:49.807583 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd"
Dec 05 20:38:49 crc kubenswrapper[4744]: I1205 20:38:49.808664 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:38:49 crc kubenswrapper[4744]: I1205 20:38:49.808758 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" gracePeriod=600
Dec 05 20:38:49 crc kubenswrapper[4744]: E1205 20:38:49.931902 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:38:50 crc kubenswrapper[4744]: I1205 20:38:50.484651 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" exitCode=0
Dec 05 20:38:50 crc kubenswrapper[4744]: I1205 20:38:50.484711 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"}
Dec 05 20:38:50 crc kubenswrapper[4744]: I1205 20:38:50.484751 4744 scope.go:117] "RemoveContainer" containerID="a41e1afd711ac794442abac71b281086d9f7a27b011779b1513b0d659dd4277c"
Dec 05 20:38:50 crc kubenswrapper[4744]: I1205 20:38:50.485601 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:38:50 crc kubenswrapper[4744]: E1205 20:38:50.485871 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:38:51 crc kubenswrapper[4744]: I1205 20:38:51.862752 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:51 crc kubenswrapper[4744]: I1205 20:38:51.869167 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:52 crc kubenswrapper[4744]: I1205 20:38:52.512059 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.247638 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.262920 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.263172 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-kuttl-api-log" containerID="cri-o://456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b" gracePeriod=30
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.263345 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-api" containerID="cri-o://770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53" gracePeriod=30
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.513898 4744 generic.go:334] "Generic (PLEG): container finished" podID="0702eefa-cff3-49c1-bc88-48752724a268" containerID="456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b" exitCode=143
Dec 05 20:38:53 crc kubenswrapper[4744]: I1205 20:38:53.514145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerDied","Data":"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"}
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.145271 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270398 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270539 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270559 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270592 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj94t\" (UniqueName: \"kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270691 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.270730 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data\") pod \"0702eefa-cff3-49c1-bc88-48752724a268\" (UID: \"0702eefa-cff3-49c1-bc88-48752724a268\") "
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.271789 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs" (OuterVolumeSpecName: "logs") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.284500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t" (OuterVolumeSpecName: "kube-api-access-vj94t") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "kube-api-access-vj94t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.302744 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.308476 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.314313 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data" (OuterVolumeSpecName: "config-data") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.331411 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "0702eefa-cff3-49c1-bc88-48752724a268" (UID: "0702eefa-cff3-49c1-bc88-48752724a268"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372652 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372695 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0702eefa-cff3-49c1-bc88-48752724a268-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372709 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj94t\" (UniqueName: \"kubernetes.io/projected/0702eefa-cff3-49c1-bc88-48752724a268-kube-api-access-vj94t\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372722 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372734 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.372744 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0702eefa-cff3-49c1-bc88-48752724a268-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.523266 4744 generic.go:334] "Generic (PLEG): container finished" podID="0702eefa-cff3-49c1-bc88-48752724a268" containerID="770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53" exitCode=0
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.523324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerDied","Data":"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"}
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.523536 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"0702eefa-cff3-49c1-bc88-48752724a268","Type":"ContainerDied","Data":"40cedc4883e653195f4070fd46f6f24909ef2f4ec0e92453b5b271b4356e2003"}
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.523562 4744 scope.go:117] "RemoveContainer" containerID="770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.523345 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.524092 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-kuttl-api-log" containerID="cri-o://3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923" gracePeriod=30
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.524107 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-api" containerID="cri-o://c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129" gracePeriod=30
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.547487 4744 scope.go:117] "RemoveContainer" containerID="456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.558451 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.570740 4744 scope.go:117] "RemoveContainer" containerID="770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"
Dec 05 20:38:54 crc kubenswrapper[4744]: E1205 20:38:54.571812 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53\": container with ID starting with 770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53 not found: ID does not exist" containerID="770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.571853 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53"} err="failed to get container status \"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53\": rpc error: code = NotFound desc = could not find container \"770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53\": container with ID starting with 770c0dbf6960462e5b55187b0311cbb3ce8aaf5a269e55ee310679a2043bbf53 not found: ID does not exist"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.571882 4744 scope.go:117] "RemoveContainer" containerID="456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"
Dec 05 20:38:54 crc kubenswrapper[4744]: E1205 20:38:54.572137 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b\": container with ID starting with 456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b not found: ID does not exist" containerID="456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.572167 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b"} err="failed to get container status \"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b\": rpc error: code = NotFound desc = could not find container \"456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b\": container with ID starting with 456bd883a5fecbae4a4da06791067f6b57626a788541db7a3d2dcb2919ace80b not found: ID does not exist"
Dec 05 20:38:54 crc kubenswrapper[4744]: I1205 20:38:54.574031 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.383685 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.488606 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.489001 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.489027 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ld9s\" (UniqueName: \"kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.489052 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.489119 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.489179 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca\") pod \"09334492-8446-449e-adbb-4866a44e850f\" (UID: \"09334492-8446-449e-adbb-4866a44e850f\") "
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.508846 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs" (OuterVolumeSpecName: "logs") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.520443 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s" (OuterVolumeSpecName: "kube-api-access-7ld9s") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "kube-api-access-7ld9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554178 4744 generic.go:334] "Generic (PLEG): container finished" podID="09334492-8446-449e-adbb-4866a44e850f" containerID="c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129" exitCode=0
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554219 4744 generic.go:334] "Generic (PLEG): container finished" podID="09334492-8446-449e-adbb-4866a44e850f" containerID="3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923" exitCode=143
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerDied","Data":"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"}
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554319 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerDied","Data":"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"}
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"09334492-8446-449e-adbb-4866a44e850f","Type":"ContainerDied","Data":"0adc6b1534a67394afd1ce170b1b58b253f08917ccd3ad8bcc1500c653de9b58"}
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554361 4744 scope.go:117] "RemoveContainer" containerID="c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.554472 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.555380 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.577942 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.591182 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09334492-8446-449e-adbb-4866a44e850f-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.591210 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.591221 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ld9s\" (UniqueName: \"kubernetes.io/projected/09334492-8446-449e-adbb-4866a44e850f-kube-api-access-7ld9s\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.591229 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.604355 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data" (OuterVolumeSpecName: "config-data") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.617125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "09334492-8446-449e-adbb-4866a44e850f" (UID: "09334492-8446-449e-adbb-4866a44e850f"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.682452 4744 scope.go:117] "RemoveContainer" containerID="3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.692516 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.692549 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09334492-8446-449e-adbb-4866a44e850f-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.699033 4744 scope.go:117] "RemoveContainer" containerID="c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"
Dec 05 20:38:55 crc kubenswrapper[4744]: E1205 20:38:55.699524 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129\": container with ID starting with c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129 not found: ID does not exist" containerID="c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.699559 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"} err="failed to get container status \"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129\": rpc error: code = NotFound desc = could not find container \"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129\": container with ID starting with c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129 not found: ID does not exist"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.699601 4744 scope.go:117] "RemoveContainer" containerID="3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"
Dec 05 20:38:55 crc kubenswrapper[4744]: E1205 20:38:55.699945 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923\": container with ID starting with 3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923 not found: ID does not exist" containerID="3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.699973 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"} err="failed to get container status \"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923\": rpc error: code = NotFound desc = could not find container \"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923\": container with ID starting with 3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923 not found: ID does not exist"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.699992 4744 scope.go:117] "RemoveContainer" containerID="c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.700215 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129"} err="failed to get container status \"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129\": rpc error: code = NotFound desc = could not find container \"c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129\": container with ID starting with c417d2f7bc1621e03a0a2d4850e20219a3963b78d5316c32622088b3b654a129 not found: ID does not exist"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.700238 4744 scope.go:117] "RemoveContainer" containerID="3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.700478 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923"} err="failed to get container status \"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923\": rpc error: code = NotFound desc = could not find container \"3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923\": container with ID starting with 3e8437a5d1a3c7202d821e2aa8cd0b47c20d7bd442a942ea92430eac5e3bc923 not found: ID does not exist"
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.881840 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:55 crc kubenswrapper[4744]: I1205 20:38:55.891869 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Dec 05 20:38:56 crc kubenswrapper[4744]: I1205 20:38:56.090206 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0702eefa-cff3-49c1-bc88-48752724a268" path="/var/lib/kubelet/pods/0702eefa-cff3-49c1-bc88-48752724a268/volumes"
Dec 05 20:38:56 crc kubenswrapper[4744]: I1205 20:38:56.090858 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09334492-8446-449e-adbb-4866a44e850f" path="/var/lib/kubelet/pods/09334492-8446-449e-adbb-4866a44e850f/volumes"
Dec 05 20:38:56 crc kubenswrapper[4744]: I1205 20:38:56.558551 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:38:56 crc kubenswrapper[4744]: I1205 20:38:56.558814 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-kuttl-api-log" containerID="cri-o://d47ddcbdb724dfc894a20117962245e1ce9dd721921887429c95a481efd1e07a" gracePeriod=30
Dec 05 20:38:56 crc kubenswrapper[4744]: I1205 20:38:56.558888 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-api" containerID="cri-o://954d6cee9a7894117b849451cb7da52d12047e8b6d5f7d979fc6675a7b6f6670" gracePeriod=30
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.597455 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerID="954d6cee9a7894117b849451cb7da52d12047e8b6d5f7d979fc6675a7b6f6670" exitCode=0
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.598058 4744 generic.go:334] "Generic (PLEG): container finished" podID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerID="d47ddcbdb724dfc894a20117962245e1ce9dd721921887429c95a481efd1e07a" exitCode=143
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.597532 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerDied","Data":"954d6cee9a7894117b849451cb7da52d12047e8b6d5f7d979fc6675a7b6f6670"}
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.598129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerDied","Data":"d47ddcbdb724dfc894a20117962245e1ce9dd721921887429c95a481efd1e07a"}
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.726479 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm"]
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.736849 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zp8zm"]
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.762649 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782046 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher58b3-account-delete-9jsr6"]
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782486 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782513 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782559 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782571 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782593 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782602 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782622 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782630 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782646 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782654 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: E1205 20:38:57.782666 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782673 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782877 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782894 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782903 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="09334492-8446-449e-adbb-4866a44e850f" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782914 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-api"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782932 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.782944 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0702eefa-cff3-49c1-bc88-48752724a268" containerName="watcher-kuttl-api-log"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.783654 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.790384 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher58b3-account-delete-9jsr6"]
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825336 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825399 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825426 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825607 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.825631 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9z4w\" (UniqueName: \"kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w\") pod \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\" (UID: \"ff2296b0-c0c7-438f-bb78-6822a09a99c9\") "
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.838752 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.844122 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs" (OuterVolumeSpecName: "logs") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.845946 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="8917b45b-f26b-459d-b5a3-a37c6237f112" containerName="watcher-decision-engine" containerID="cri-o://d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47" gracePeriod=30
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.879738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w" (OuterVolumeSpecName: "kube-api-access-z9z4w") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "kube-api-access-z9z4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.929490 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.930386 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5m72\" (UniqueName: \"kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.930472 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.930548 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2296b0-c0c7-438f-bb78-6822a09a99c9-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.930559 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.930569 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9z4w\" (UniqueName: \"kubernetes.io/projected/ff2296b0-c0c7-438f-bb78-6822a09a99c9-kube-api-access-z9z4w\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.943190 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.977876 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:38:57 crc kubenswrapper[4744]: I1205 20:38:57.978209 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerName="watcher-applier" containerID="cri-o://0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" gracePeriod=30
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.000921 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data" (OuterVolumeSpecName: "config-data") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.005855 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ff2296b0-c0c7-438f-bb78-6822a09a99c9" (UID: "ff2296b0-c0c7-438f-bb78-6822a09a99c9"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036107 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5m72\" (UniqueName: \"kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036198 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036312 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036324 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036333 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2296b0-c0c7-438f-bb78-6822a09a99c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.036878 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.053851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5m72\" (UniqueName: \"kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72\") pod \"watcher58b3-account-delete-9jsr6\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.092800 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1fb22d-2926-4e72-946d-164071db6f9a" path="/var/lib/kubelet/pods/3c1fb22d-2926-4e72-946d-164071db6f9a/volumes"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.105904 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.607582 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff2296b0-c0c7-438f-bb78-6822a09a99c9","Type":"ContainerDied","Data":"bd5c20e46a20a72c1260cef921f23466506f0c74a85f4023beea3b4bcd010361"}
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.608931 4744 scope.go:117] "RemoveContainer" containerID="954d6cee9a7894117b849451cb7da52d12047e8b6d5f7d979fc6675a7b6f6670"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.607656 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.630880 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher58b3-account-delete-9jsr6"]
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.634204 4744 scope.go:117] "RemoveContainer" containerID="d47ddcbdb724dfc894a20117962245e1ce9dd721921887429c95a481efd1e07a"
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.656925 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:38:58 crc kubenswrapper[4744]: I1205 20:38:58.668280 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:38:59 crc kubenswrapper[4744]: I1205 20:38:59.618892 4744 generic.go:334] "Generic (PLEG): container finished" podID="056c1c70-dc14-4d77-8396-0c52f4c909b4" containerID="000acd9f79f8e63fdef9b0ea4b5f7b17fe7ffdfdb291e73f0f2b2611c1d7bb10" exitCode=0
Dec 05 20:38:59 crc kubenswrapper[4744]: I1205 20:38:59.618970 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6" event={"ID":"056c1c70-dc14-4d77-8396-0c52f4c909b4","Type":"ContainerDied","Data":"000acd9f79f8e63fdef9b0ea4b5f7b17fe7ffdfdb291e73f0f2b2611c1d7bb10"}
Dec 05 20:38:59 crc kubenswrapper[4744]: I1205 20:38:59.619125 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6" event={"ID":"056c1c70-dc14-4d77-8396-0c52f4c909b4","Type":"ContainerStarted","Data":"1f0b672fc1bb23d550df1cfed105d843ba0a381902972bf828b0893565d9930c"}
Dec 05 20:38:59 crc kubenswrapper[4744]: E1205 20:38:59.764513 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:38:59 crc kubenswrapper[4744]: E1205 20:38:59.766333 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:38:59 crc kubenswrapper[4744]: E1205 20:38:59.767525 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 05 20:38:59 crc kubenswrapper[4744]: E1205 20:38:59.767554 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerName="watcher-applier"
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.091157 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2296b0-c0c7-438f-bb78-6822a09a99c9" path="/var/lib/kubelet/pods/ff2296b0-c0c7-438f-bb78-6822a09a99c9/volumes"
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.233164 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.233434 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-central-agent" containerID="cri-o://c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874" gracePeriod=30
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.233722 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="sg-core" containerID="cri-o://3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54" gracePeriod=30
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.233820 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="proxy-httpd" containerID="cri-o://5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7" gracePeriod=30
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.233990 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-notification-agent" containerID="cri-o://7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744" gracePeriod=30
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.271485 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.424789 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577387 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577486 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmrzn\" (UniqueName: \"kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.577563 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls\") pod \"8917b45b-f26b-459d-b5a3-a37c6237f112\" (UID: \"8917b45b-f26b-459d-b5a3-a37c6237f112\") "
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.578683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs" (OuterVolumeSpecName: "logs") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.583671 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn" (OuterVolumeSpecName: "kube-api-access-mmrzn") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "kube-api-access-mmrzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.601849 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.605206 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.623104 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data" (OuterVolumeSpecName: "config-data") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.629956 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerID="5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7" exitCode=0
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.629990 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerID="3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54" exitCode=2
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.629998 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerID="c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874" exitCode=0
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.630039 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerDied","Data":"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7"}
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.630081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerDied","Data":"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54"}
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.630091 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerDied","Data":"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874"}
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.631525 4744 generic.go:334] "Generic (PLEG): container finished" podID="8917b45b-f26b-459d-b5a3-a37c6237f112" containerID="d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47" exitCode=0
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.631548 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8917b45b-f26b-459d-b5a3-a37c6237f112","Type":"ContainerDied","Data":"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47"}
Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.631572 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8917b45b-f26b-459d-b5a3-a37c6237f112","Type":"ContainerDied","Data":"fc8a0a97c603689ae23587d6604b8c1ac1a0c520281cfb7a770bb2a8b9f943e7"}
Dec 05 20:39:00 crc
kubenswrapper[4744]: I1205 20:39:00.631591 4744 scope.go:117] "RemoveContainer" containerID="d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.631593 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.641777 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "8917b45b-f26b-459d-b5a3-a37c6237f112" (UID: "8917b45b-f26b-459d-b5a3-a37c6237f112"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.655904 4744 scope.go:117] "RemoveContainer" containerID="d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47" Dec 05 20:39:00 crc kubenswrapper[4744]: E1205 20:39:00.661044 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47\": container with ID starting with d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47 not found: ID does not exist" containerID="d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.661099 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47"} err="failed to get container status \"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47\": rpc error: code = NotFound desc = could not find container \"d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47\": container with ID starting with d12de4ffe9019684f3ce1463cf1dfc935901530c69b301735b3e53edc2867a47 not found: ID does not exist" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679225 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8917b45b-f26b-459d-b5a3-a37c6237f112-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679262 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679272 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679282 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679312 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmrzn\" (UniqueName: \"kubernetes.io/projected/8917b45b-f26b-459d-b5a3-a37c6237f112-kube-api-access-mmrzn\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.679324 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/8917b45b-f26b-459d-b5a3-a37c6237f112-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.943305 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6" Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.990352 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:39:00 crc kubenswrapper[4744]: I1205 20:39:00.996725 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.085740 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5m72\" (UniqueName: \"kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72\") pod \"056c1c70-dc14-4d77-8396-0c52f4c909b4\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.085838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts\") pod \"056c1c70-dc14-4d77-8396-0c52f4c909b4\" (UID: \"056c1c70-dc14-4d77-8396-0c52f4c909b4\") " Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.086578 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "056c1c70-dc14-4d77-8396-0c52f4c909b4" (UID: "056c1c70-dc14-4d77-8396-0c52f4c909b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.089932 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72" (OuterVolumeSpecName: "kube-api-access-c5m72") pod "056c1c70-dc14-4d77-8396-0c52f4c909b4" (UID: "056c1c70-dc14-4d77-8396-0c52f4c909b4"). InnerVolumeSpecName "kube-api-access-c5m72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.187515 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5m72\" (UniqueName: \"kubernetes.io/projected/056c1c70-dc14-4d77-8396-0c52f4c909b4-kube-api-access-c5m72\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.187547 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056c1c70-dc14-4d77-8396-0c52f4c909b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.644432 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6" event={"ID":"056c1c70-dc14-4d77-8396-0c52f4c909b4","Type":"ContainerDied","Data":"1f0b672fc1bb23d550df1cfed105d843ba0a381902972bf828b0893565d9930c"} Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.644477 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f0b672fc1bb23d550df1cfed105d843ba0a381902972bf828b0893565d9930c" Dec 05 20:39:01 crc kubenswrapper[4744]: I1205 20:39:01.644521 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher58b3-account-delete-9jsr6" Dec 05 20:39:01 crc kubenswrapper[4744]: E1205 20:39:01.820695 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod056c1c70_dc14_4d77_8396_0c52f4c909b4.slice/crio-1f0b672fc1bb23d550df1cfed105d843ba0a381902972bf828b0893565d9930c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod056c1c70_dc14_4d77_8396_0c52f4c909b4.slice\": RecentStats: unable to find data in memory cache]" Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.094909 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8917b45b-f26b-459d-b5a3-a37c6237f112" path="/var/lib/kubelet/pods/8917b45b-f26b-459d-b5a3-a37c6237f112/volumes" Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.855539 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-876kl"] Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.863715 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-876kl"] Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.874195 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher58b3-account-delete-9jsr6"] Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.882478 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5"] Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.891196 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher58b3-account-delete-9jsr6"] Dec 05 20:39:02 crc kubenswrapper[4744]: I1205 20:39:02.896675 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-58b3-account-create-update-6dpz5"] Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.080087 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:39:03 crc kubenswrapper[4744]: E1205 20:39:03.080380 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.680417 4744 generic.go:334] "Generic (PLEG): container finished" podID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerID="0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" exitCode=0 Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.680482 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52","Type":"ContainerDied","Data":"0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22"} Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.766844 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.836624 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrwjs\" (UniqueName: \"kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs\") pod \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.836698 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls\") pod \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.836733 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs\") pod \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.836763 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle\") pod \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.836874 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data\") pod \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\" (UID: \"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52\") " Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.838016 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs" (OuterVolumeSpecName: "logs") pod "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" (UID: "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.848653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs" (OuterVolumeSpecName: "kube-api-access-xrwjs") pod "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" (UID: "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52"). InnerVolumeSpecName "kube-api-access-xrwjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.864618 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" (UID: "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.908626 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data" (OuterVolumeSpecName: "config-data") pod "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" (UID: "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.919580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" (UID: "309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.938424 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.938453 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrwjs\" (UniqueName: \"kubernetes.io/projected/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-kube-api-access-xrwjs\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.938464 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.938473 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:03 crc kubenswrapper[4744]: I1205 20:39:03.938480 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.096012 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056c1c70-dc14-4d77-8396-0c52f4c909b4" path="/var/lib/kubelet/pods/056c1c70-dc14-4d77-8396-0c52f4c909b4/volumes" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.097632 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57f5b4f-fe7b-406e-ac8f-c934b2149ae4" path="/var/lib/kubelet/pods/e57f5b4f-fe7b-406e-ac8f-c934b2149ae4/volumes" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.098623 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3ec182-e3b5-464b-b973-e2599aff944c" path="/var/lib/kubelet/pods/ea3ec182-e3b5-464b-b973-e2599aff944c/volumes" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.460546 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549451 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549524 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549568 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549594 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549623 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtkw\" (UniqueName: \"kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549670 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549766 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.549816 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle\") pod \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\" (UID: \"f0215aeb-42a9-4032-95fc-cb5a9389ddd3\") " Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.550075 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.550338 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.550890 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.550916 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.553689 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts" (OuterVolumeSpecName: "scripts") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.558489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw" (OuterVolumeSpecName: "kube-api-access-7dtkw") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "kube-api-access-7dtkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.574911 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.606434 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.624860 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.639189 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data" (OuterVolumeSpecName: "config-data") pod "f0215aeb-42a9-4032-95fc-cb5a9389ddd3" (UID: "f0215aeb-42a9-4032-95fc-cb5a9389ddd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653267 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653323 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653335 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653349 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653360 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.653370 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtkw\" (UniqueName: \"kubernetes.io/projected/f0215aeb-42a9-4032-95fc-cb5a9389ddd3-kube-api-access-7dtkw\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.691543 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52","Type":"ContainerDied","Data":"9217686e4d37b2bd01d9511ff923355820b6b06abcbe3158c6545e0e6f0d003a"} Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.691589 4744 scope.go:117] "RemoveContainer" containerID="0bcf6bb7c1a1ad47c5cc62fdff3c38853496e0359cb6409d7568a7f115d1ee22" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.691767 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.702748 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerID="7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744" exitCode=0 Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.702831 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerDied","Data":"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744"} Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.703092 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f0215aeb-42a9-4032-95fc-cb5a9389ddd3","Type":"ContainerDied","Data":"fd2e2057e45f73ef25e4371994b30689f0faef8e87e524b697413cce06d87101"} Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.703138 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.719342 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.725196 4744 scope.go:117] "RemoveContainer" containerID="5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.725651 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.753313 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.755819 4744 scope.go:117] "RemoveContainer" containerID="3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.762470 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.774630 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.774942 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-notification-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.774958 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-notification-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.774977 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056c1c70-dc14-4d77-8396-0c52f4c909b4" containerName="mariadb-account-delete" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.774985 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="056c1c70-dc14-4d77-8396-0c52f4c909b4" containerName="mariadb-account-delete" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.775365 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-central-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775373 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" 
containerName="ceilometer-central-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.775384 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerName="watcher-applier" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775394 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerName="watcher-applier" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.775409 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="proxy-httpd" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775417 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="proxy-httpd" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.775426 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="sg-core" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775432 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="sg-core" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.775439 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8917b45b-f26b-459d-b5a3-a37c6237f112" containerName="watcher-decision-engine" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775445 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8917b45b-f26b-459d-b5a3-a37c6237f112" containerName="watcher-decision-engine" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775636 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8917b45b-f26b-459d-b5a3-a37c6237f112" containerName="watcher-decision-engine" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775647 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-central-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775658 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" containerName="watcher-applier" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775670 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="056c1c70-dc14-4d77-8396-0c52f4c909b4" containerName="mariadb-account-delete" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775683 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="proxy-httpd" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775693 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="ceilometer-notification-agent" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.775721 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" containerName="sg-core" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.777350 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.780742 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.780967 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.781047 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.781108 4744 scope.go:117] "RemoveContainer" containerID="7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.805824 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.822101 4744 scope.go:117] "RemoveContainer" containerID="c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855345 4744 scope.go:117] "RemoveContainer" containerID="5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855404 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855447 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855526 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrcw\" (UniqueName: \"kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855593 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855707 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855734 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855756 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.855811 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7\": container with ID starting with 5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7 not found: ID does not exist" containerID="5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855877 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7"} err="failed to get container status \"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7\": rpc error: code = NotFound desc = could not find container \"5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7\": container with ID starting with 5c205e0f71ec4d9227f2522d210f8b3f49a1a994b8b48786f45dbdd796d502d7 not found: ID does not exist" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.855919 4744 scope.go:117] "RemoveContainer" containerID="3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.856425 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54\": container with ID starting with 3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54 not found: ID does not exist" containerID="3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.856474 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54"} err="failed to get container status \"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54\": rpc error: code = NotFound desc = could not find container \"3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54\": container with ID starting with 3950d8fd46870f54d6706f6ff9fdfb88f979e4f06fc8fed1e1675cc301931a54 not found: ID does not exist" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.856493 4744 scope.go:117] "RemoveContainer" containerID="7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.856804 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744\": container with ID starting with 7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744 not found: ID does not exist" containerID="7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.856838 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744"} err="failed to get container status \"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744\": rpc error: code = NotFound desc = could not find container \"7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744\": container with ID starting with 7effb828d86c405022713afb41f36c08d7c197f0d81dd070b4dcd12d210fc744 not found: ID does not exist" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.856863 4744 scope.go:117] "RemoveContainer" containerID="c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874" Dec 05 20:39:04 crc kubenswrapper[4744]: E1205 20:39:04.857177 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874\": container with ID starting with c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874 not found: ID does not exist" containerID="c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.857203 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874"} err="failed to get container status \"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874\": rpc error: code = NotFound desc = could not find container \"c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874\": container with ID starting with c5bb4b4629df53111760f03c0a359a03c8c3565b4fbcd0683135ce85c1f17874 not found: ID does not exist" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957002 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957090 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957118 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrcw\" (UniqueName: \"kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957141 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957185 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957200 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.957216 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.958193 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.958705 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.962506 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.963166 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.963165 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.964863 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts\") pod \"ceilometer-0\" (UID: 
\"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.971807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:04 crc kubenswrapper[4744]: I1205 20:39:04.976454 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrcw\" (UniqueName: \"kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw\") pod \"ceilometer-0\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:05 crc kubenswrapper[4744]: I1205 20:39:05.106098 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:05 crc kubenswrapper[4744]: I1205 20:39:05.606527 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:05 crc kubenswrapper[4744]: I1205 20:39:05.715849 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerStarted","Data":"300ff1383206ef2ff935085dfef34acec075dc6b62510eddf4f702a5a5af8391"} Dec 05 20:39:06 crc kubenswrapper[4744]: I1205 20:39:06.091551 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52" path="/var/lib/kubelet/pods/309dd5ae-4de9-4ec1-9dc5-8d5c2cfd1a52/volumes" Dec 05 20:39:06 crc kubenswrapper[4744]: I1205 20:39:06.092374 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0215aeb-42a9-4032-95fc-cb5a9389ddd3" path="/var/lib/kubelet/pods/f0215aeb-42a9-4032-95fc-cb5a9389ddd3/volumes" Dec 05 20:39:06 crc kubenswrapper[4744]: I1205 20:39:06.725972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerStarted","Data":"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8"} Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.307801 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-sgc5p"] Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.308928 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.325247 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"] Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.327154 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.329169 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.339220 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sgc5p"] Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.352752 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"] Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.397741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.397791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.397816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8ft\" (UniqueName: \"kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.397853 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mdj\" (UniqueName: \"kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.499247 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.499394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.499454 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8ft\" (UniqueName: \"kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: 
\"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.499550 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mdj\" (UniqueName: \"kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.500092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.503319 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.518003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mdj\" (UniqueName: \"kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj\") pod \"watcher-db-create-sgc5p\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") " pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.520728 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8ft\" (UniqueName: \"kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft\") pod \"watcher-ab56-account-create-update-8gh9t\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") " pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.657608 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sgc5p" Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.666471 4744 util.go:30] "No sandbox for pod can be found. 
Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.791330 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerStarted","Data":"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1"}
Dec 05 20:39:07 crc kubenswrapper[4744]: I1205 20:39:07.791873 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerStarted","Data":"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.303248 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sgc5p"]
Dec 05 20:39:08 crc kubenswrapper[4744]: W1205 20:39:08.308495 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83db5e02_d8cc_4e8b_88d4_f00b916e34fb.slice/crio-524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6 WatchSource:0}: Error finding container 524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6: Status 404 returned error can't find the container with id 524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.393446 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"]
Dec 05 20:39:08 crc kubenswrapper[4744]: W1205 20:39:08.397718 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0677bd03_1b77_4a98_8270_79816ee729bb.slice/crio-647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27 WatchSource:0}: Error finding container 647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27: Status 404 returned error can't find the container with id 647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.801399 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" event={"ID":"0677bd03-1b77-4a98-8270-79816ee729bb","Type":"ContainerStarted","Data":"87d306283b8f8d70c87a7ad7547dac093940d3b3e5a81c886b13f4a91e4be5c9"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.801724 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" event={"ID":"0677bd03-1b77-4a98-8270-79816ee729bb","Type":"ContainerStarted","Data":"647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.803250 4744 generic.go:334] "Generic (PLEG): container finished" podID="83db5e02-d8cc-4e8b-88d4-f00b916e34fb" containerID="b979aee5296f25ac4ce2be7b962db5441e75c8e7f51bbf152627f541483ca5e2" exitCode=0
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.803322 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sgc5p" event={"ID":"83db5e02-d8cc-4e8b-88d4-f00b916e34fb","Type":"ContainerDied","Data":"b979aee5296f25ac4ce2be7b962db5441e75c8e7f51bbf152627f541483ca5e2"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.803342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sgc5p" event={"ID":"83db5e02-d8cc-4e8b-88d4-f00b916e34fb","Type":"ContainerStarted","Data":"524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.806478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerStarted","Data":"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8"}
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.806673 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.818949 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" podStartSLOduration=1.8189321550000002 podStartE2EDuration="1.818932155s" podCreationTimestamp="2025-12-05 20:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:39:08.816853138 +0000 UTC m=+1719.046664506" watchObservedRunningTime="2025-12-05 20:39:08.818932155 +0000 UTC m=+1719.048743533"
Dec 05 20:39:08 crc kubenswrapper[4744]: I1205 20:39:08.838548 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.965806086 podStartE2EDuration="4.8385303s" podCreationTimestamp="2025-12-05 20:39:04 +0000 UTC" firstStartedPulling="2025-12-05 20:39:05.607773771 +0000 UTC m=+1715.837585179" lastFinishedPulling="2025-12-05 20:39:08.480498025 +0000 UTC m=+1718.710309393" observedRunningTime="2025-12-05 20:39:08.83669854 +0000 UTC m=+1719.066509908" watchObservedRunningTime="2025-12-05 20:39:08.8385303 +0000 UTC m=+1719.068341668"
Dec 05 20:39:09 crc kubenswrapper[4744]: I1205 20:39:09.821013 4744 generic.go:334] "Generic (PLEG): container finished" podID="0677bd03-1b77-4a98-8270-79816ee729bb" containerID="87d306283b8f8d70c87a7ad7547dac093940d3b3e5a81c886b13f4a91e4be5c9" exitCode=0
Dec 05 20:39:09 crc kubenswrapper[4744]: I1205 20:39:09.821076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" event={"ID":"0677bd03-1b77-4a98-8270-79816ee729bb","Type":"ContainerDied","Data":"87d306283b8f8d70c87a7ad7547dac093940d3b3e5a81c886b13f4a91e4be5c9"}
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.176590 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sgc5p"
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.265543 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts\") pod \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") "
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.265696 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mdj\" (UniqueName: \"kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj\") pod \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\" (UID: \"83db5e02-d8cc-4e8b-88d4-f00b916e34fb\") "
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.266313 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83db5e02-d8cc-4e8b-88d4-f00b916e34fb" (UID: "83db5e02-d8cc-4e8b-88d4-f00b916e34fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.272481 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj" (OuterVolumeSpecName: "kube-api-access-v9mdj") pod "83db5e02-d8cc-4e8b-88d4-f00b916e34fb" (UID: "83db5e02-d8cc-4e8b-88d4-f00b916e34fb"). InnerVolumeSpecName "kube-api-access-v9mdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.367998 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mdj\" (UniqueName: \"kubernetes.io/projected/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-kube-api-access-v9mdj\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.368039 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db5e02-d8cc-4e8b-88d4-f00b916e34fb-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.835278 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sgc5p" event={"ID":"83db5e02-d8cc-4e8b-88d4-f00b916e34fb","Type":"ContainerDied","Data":"524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6"}
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.835343 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sgc5p"
Dec 05 20:39:10 crc kubenswrapper[4744]: I1205 20:39:10.835360 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524cafe6a5c853c8d96c2de6683551e895fdafa3434bf07d7acc9ff92ccdd8f6"
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.167776 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.284163 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts\") pod \"0677bd03-1b77-4a98-8270-79816ee729bb\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") "
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.284233 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8ft\" (UniqueName: \"kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft\") pod \"0677bd03-1b77-4a98-8270-79816ee729bb\" (UID: \"0677bd03-1b77-4a98-8270-79816ee729bb\") "
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.284778 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0677bd03-1b77-4a98-8270-79816ee729bb" (UID: "0677bd03-1b77-4a98-8270-79816ee729bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.288821 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft" (OuterVolumeSpecName: "kube-api-access-vx8ft") pod "0677bd03-1b77-4a98-8270-79816ee729bb" (UID: "0677bd03-1b77-4a98-8270-79816ee729bb"). InnerVolumeSpecName "kube-api-access-vx8ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.387197 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0677bd03-1b77-4a98-8270-79816ee729bb-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.387256 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8ft\" (UniqueName: \"kubernetes.io/projected/0677bd03-1b77-4a98-8270-79816ee729bb-kube-api-access-vx8ft\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.845193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t" event={"ID":"0677bd03-1b77-4a98-8270-79816ee729bb","Type":"ContainerDied","Data":"647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27"}
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.845509 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="647af3a19a231681ae117d497da55aa8c86b58c362fcad130cc18306d8575e27"
Dec 05 20:39:11 crc kubenswrapper[4744]: I1205 20:39:11.845562 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"
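Both job pods are torn down in the mirror image of the mount flow: the PLEG reports ContainerDied, the reconciler starts UnmountVolume (reconciler_common.go:159), TearDown succeeds (operation_generator.go:803), and finally the volume is reported detached with an empty DevicePath (reconciler_common.go:293). When post-processing a capture like this one, that ordering is an invariant worth asserting. A sketch of such a checker over pre-parsed events; the Event type and phase names are inventions for the example, not kubelet types:

    package main

    import "fmt"

    // Event is one parsed journal entry in a volume's lifecycle; the phases
    // mirror the log lines: mounted, unmount-started, torn-down, detached.
    type Event struct{ PodUID, Volume, Phase string }

    var rank = map[string]int{"mounted": 0, "unmount-started": 1, "torn-down": 2, "detached": 3}

    // checkOrder asserts the invariant visible above: phases for a given
    // pod/volume pair only ever move forward.
    func checkOrder(events []Event) error {
        last := map[string]int{}
        for _, e := range events {
            k := e.PodUID + "/" + e.Volume
            if rank[e.Phase] < last[k] {
                return fmt.Errorf("%s: phase %q arrived out of order", k, e.Phase)
            }
            last[k] = rank[e.Phase]
        }
        return nil
    }

    func main() {
        fmt.Println(checkOrder([]Event{
            {"83db5e02", "operator-scripts", "mounted"},
            {"83db5e02", "operator-scripts", "unmount-started"},
            {"83db5e02", "operator-scripts", "torn-down"},
            {"83db5e02", "operator-scripts", "detached"},
        })) // <nil>
    }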
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.646519 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"]
Dec 05 20:39:12 crc kubenswrapper[4744]: E1205 20:39:12.646860 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0677bd03-1b77-4a98-8270-79816ee729bb" containerName="mariadb-account-create-update"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.646876 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0677bd03-1b77-4a98-8270-79816ee729bb" containerName="mariadb-account-create-update"
Dec 05 20:39:12 crc kubenswrapper[4744]: E1205 20:39:12.646905 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83db5e02-d8cc-4e8b-88d4-f00b916e34fb" containerName="mariadb-database-create"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.646912 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db5e02-d8cc-4e8b-88d4-f00b916e34fb" containerName="mariadb-database-create"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.647070 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0677bd03-1b77-4a98-8270-79816ee729bb" containerName="mariadb-account-create-update"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.647088 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="83db5e02-d8cc-4e8b-88d4-f00b916e34fb" containerName="mariadb-database-create"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.647701 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.651014 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.651142 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-mdbsh"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.665898 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"]
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.809574 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mrjd\" (UniqueName: \"kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.809907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.809933 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.809957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.911102 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.911163 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.911198 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.911268 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mrjd\" (UniqueName: \"kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.917649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.919026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.935568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.936116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mrjd\" (UniqueName: \"kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd\") pod \"watcher-kuttl-db-sync-zjftx\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:12 crc kubenswrapper[4744]: I1205 20:39:12.968653 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:13 crc kubenswrapper[4744]: I1205 20:39:13.466833 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"]
Dec 05 20:39:13 crc kubenswrapper[4744]: W1205 20:39:13.474019 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96afaf9f_e24d_42ae_8d45_dcc85b9663c9.slice/crio-353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a WatchSource:0}: Error finding container 353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a: Status 404 returned error can't find the container with id 353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a
Dec 05 20:39:13 crc kubenswrapper[4744]: I1205 20:39:13.874359 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx" event={"ID":"96afaf9f-e24d-42ae-8d45-dcc85b9663c9","Type":"ContainerStarted","Data":"24dcd57e4c27bb2b600d6f5f61855187c6a46378fcbdc25e54daf802874edad6"}
Dec 05 20:39:13 crc kubenswrapper[4744]: I1205 20:39:13.874409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx" event={"ID":"96afaf9f-e24d-42ae-8d45-dcc85b9663c9","Type":"ContainerStarted","Data":"353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a"}
Dec 05 20:39:13 crc kubenswrapper[4744]: I1205 20:39:13.891649 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx" podStartSLOduration=1.8916227559999998 podStartE2EDuration="1.891622756s" podCreationTimestamp="2025-12-05 20:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:39:13.890287467 +0000 UTC m=+1724.120098875" watchObservedRunningTime="2025-12-05 20:39:13.891622756 +0000 UTC m=+1724.121434134"
Dec 05 20:39:15 crc kubenswrapper[4744]: I1205 20:39:15.895736 4744 generic.go:334] "Generic (PLEG): container finished" podID="96afaf9f-e24d-42ae-8d45-dcc85b9663c9" containerID="24dcd57e4c27bb2b600d6f5f61855187c6a46378fcbdc25e54daf802874edad6" exitCode=0
Dec 05 20:39:15 crc kubenswrapper[4744]: I1205 20:39:15.895800 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx" event={"ID":"96afaf9f-e24d-42ae-8d45-dcc85b9663c9","Type":"ContainerDied","Data":"24dcd57e4c27bb2b600d6f5f61855187c6a46378fcbdc25e54daf802874edad6"}
Dec 05 20:39:16 crc kubenswrapper[4744]: I1205 20:39:16.082095 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:39:16 crc kubenswrapper[4744]: E1205 20:39:16.082499 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.255817 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.382675 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data\") pod \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") "
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.382804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle\") pod \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") "
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.382873 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mrjd\" (UniqueName: \"kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd\") pod \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") "
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.382929 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data\") pod \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\" (UID: \"96afaf9f-e24d-42ae-8d45-dcc85b9663c9\") "
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.388980 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96afaf9f-e24d-42ae-8d45-dcc85b9663c9" (UID: "96afaf9f-e24d-42ae-8d45-dcc85b9663c9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.389219 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd" (OuterVolumeSpecName: "kube-api-access-5mrjd") pod "96afaf9f-e24d-42ae-8d45-dcc85b9663c9" (UID: "96afaf9f-e24d-42ae-8d45-dcc85b9663c9"). InnerVolumeSpecName "kube-api-access-5mrjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.424509 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96afaf9f-e24d-42ae-8d45-dcc85b9663c9" (UID: "96afaf9f-e24d-42ae-8d45-dcc85b9663c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.453161 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data" (OuterVolumeSpecName: "config-data") pod "96afaf9f-e24d-42ae-8d45-dcc85b9663c9" (UID: "96afaf9f-e24d-42ae-8d45-dcc85b9663c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.484148 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mrjd\" (UniqueName: \"kubernetes.io/projected/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-kube-api-access-5mrjd\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.484186 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.484196 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.484204 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96afaf9f-e24d-42ae-8d45-dcc85b9663c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.918882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx" event={"ID":"96afaf9f-e24d-42ae-8d45-dcc85b9663c9","Type":"ContainerDied","Data":"353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a"}
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.918930 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353488537f018cc390150fd6c0e73e0f6eea5fce82474670c3b439d48096904a"
Dec 05 20:39:17 crc kubenswrapper[4744]: I1205 20:39:17.918995 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.348162 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: E1205 20:39:18.348562 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96afaf9f-e24d-42ae-8d45-dcc85b9663c9" containerName="watcher-kuttl-db-sync"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.348581 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="96afaf9f-e24d-42ae-8d45-dcc85b9663c9" containerName="watcher-kuttl-db-sync"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.348796 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="96afaf9f-e24d-42ae-8d45-dcc85b9663c9" containerName="watcher-kuttl-db-sync"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.349509 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.352949 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-mdbsh"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.353055 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.368598 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.407520 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.409038 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.411820 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.422906 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.424767 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.427829 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.433065 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.452267 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ww5h\" (UniqueName: \"kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500777 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500890 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500931 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500974 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.500999 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7x6\" (UniqueName: \"kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501111 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501169 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5p79\" (UniqueName: \"kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501209 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501224 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501272 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.501564 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603342 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603388 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603433 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7x6\" (UniqueName: \"kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603454 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603481 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5p79\" (UniqueName: \"kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603555 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603570 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603594 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603635 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603689 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ww5h\" (UniqueName: \"kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603708 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603724 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.603937 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.604154 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.604712 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.607656 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.608528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.608661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.608765 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.608970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.609188 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.609431 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.609654 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.612150 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.617940 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.626888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.637917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ww5h\" (UniqueName: \"kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.638919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7x6\" (UniqueName: \"kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6\") pod \"watcher-kuttl-applier-0\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.642842 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5p79\" (UniqueName: \"kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79\") pod \"watcher-kuttl-api-0\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.669900 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.738037 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:18 crc kubenswrapper[4744]: I1205 20:39:18.760884 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.175319 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:39:19 crc kubenswrapper[4744]: W1205 20:39:19.182890 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3a6e04_6c1e_4661_914a_be0fb1ea8792.slice/crio-17dcd65dcf4422324515c9a90c877cb339f13f5ff5beb0e8d3c41e70f2d764d4 WatchSource:0}: Error finding container 17dcd65dcf4422324515c9a90c877cb339f13f5ff5beb0e8d3c41e70f2d764d4: Status 404 returned error can't find the container with id 17dcd65dcf4422324515c9a90c877cb339f13f5ff5beb0e8d3c41e70f2d764d4
Dec 05 20:39:19 crc kubenswrapper[4744]: W1205 20:39:19.291656 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07c4b443_fda4_4eca_b1ad_4423b01e3aad.slice/crio-941ad5be3ec32832dd02da5ebc14716d95ec1e9e0d54960438271fce610fc57d WatchSource:0}: Error finding container 941ad5be3ec32832dd02da5ebc14716d95ec1e9e0d54960438271fce610fc57d: Status 404 returned error can't find the container with id 941ad5be3ec32832dd02da5ebc14716d95ec1e9e0d54960438271fce610fc57d
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.292835 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.354308 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:39:19 crc kubenswrapper[4744]: W1205 20:39:19.358615 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b3ca55d_f6d9_4b59_81b4_4295f3b20d18.slice/crio-100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54 WatchSource:0}: Error finding container 100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54: Status 404 returned error can't find the container with id 100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54
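The manager.go:1169 warnings that accompany each sandbox start look alarming but appear to be a benign race: cAdvisor notices the freshly created cgroup and asks the runtime about the container before CRI-O has registered it, so the lookup 404s; each of the containers in question reports ContainerStarted moments later. The cgroup path itself encodes the QoS class, the pod UID (with dashes flattened to underscores), and the 64-hex CRI-O container ID, all of which can be recovered mechanically. A sketch, with the regular expression being an assumption about the path layout seen here:

    package main

    import (
        "fmt"
        "regexp"
    )

    // Captures the pod UID (underscored) and the 64-hex CRI-O container ID
    // from the besteffort cgroup paths in the watch-event warnings.
    var cgRe = regexp.MustCompile(`kubepods-besteffort-pod([0-9a-f_]+)\.slice/crio-([0-9a-f]{64})`)

    func main() {
        p := "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b3ca55d_f6d9_4b59_81b4_4295f3b20d18.slice/crio-100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54"
        m := cgRe.FindStringSubmatch(p)
        fmt.Println("pod UID:", m[1])
        fmt.Println("container:", m[2])
    }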
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.948711 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ea3a6e04-6c1e-4661-914a-be0fb1ea8792","Type":"ContainerStarted","Data":"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.949028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ea3a6e04-6c1e-4661-914a-be0fb1ea8792","Type":"ContainerStarted","Data":"17dcd65dcf4422324515c9a90c877cb339f13f5ff5beb0e8d3c41e70f2d764d4"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.951510 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerStarted","Data":"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.951544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerStarted","Data":"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.951560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerStarted","Data":"100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.952430 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.954980 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07c4b443-fda4-4eca-b1ad-4423b01e3aad","Type":"ContainerStarted","Data":"dd76b0bb0a339f67a079b017ab055266bfd00727403284b81bfe4483240aed85"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.955011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07c4b443-fda4-4eca-b1ad-4423b01e3aad","Type":"ContainerStarted","Data":"941ad5be3ec32832dd02da5ebc14716d95ec1e9e0d54960438271fce610fc57d"}
Dec 05 20:39:19 crc kubenswrapper[4744]: I1205 20:39:19.983279 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.983254326 podStartE2EDuration="1.983254326s" podCreationTimestamp="2025-12-05 20:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:39:19.973714514 +0000 UTC m=+1730.203525892" watchObservedRunningTime="2025-12-05 20:39:19.983254326 +0000 UTC m=+1730.213065714"
Dec 05 20:39:20 crc kubenswrapper[4744]: I1205 20:39:20.001267 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.001244805 podStartE2EDuration="2.001244805s" podCreationTimestamp="2025-12-05 20:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:39:19.992495971 +0000 UTC m=+1730.222307359" watchObservedRunningTime="2025-12-05 20:39:20.001244805 +0000 UTC m=+1730.231056183"
Dec 05 20:39:20 crc kubenswrapper[4744]: I1205 20:39:20.018959 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.018938259 podStartE2EDuration="2.018938259s" podCreationTimestamp="2025-12-05 20:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:39:20.012393563 +0000 UTC m=+1730.242204931" watchObservedRunningTime="2025-12-05 20:39:20.018938259 +0000 UTC m=+1730.248749627"
Dec 05 20:39:21 crc kubenswrapper[4744]: I1205 20:39:21.036742 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:21 crc kubenswrapper[4744]: I1205 20:39:21.989393 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 20:39:22 crc kubenswrapper[4744]: I1205 20:39:22.187773 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:22 crc kubenswrapper[4744]: I1205 20:39:22.318505 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:23 crc kubenswrapper[4744]: I1205 20:39:23.373683 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:23 crc kubenswrapper[4744]: I1205 20:39:23.738655 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:23 crc kubenswrapper[4744]: I1205 20:39:23.761240 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:24 crc kubenswrapper[4744]: I1205 20:39:24.562491 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:25 crc kubenswrapper[4744]: I1205 20:39:25.747273 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:26 crc kubenswrapper[4744]: I1205 20:39:26.986831 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.194040 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.670839 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.704474 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.738469 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.762442 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.772379 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:28 crc kubenswrapper[4744]: I1205 20:39:28.773938 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.050729 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.059809 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.080456 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:39:29 crc kubenswrapper[4744]: E1205 20:39:29.080702 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.086161 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.107172 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:39:29 crc kubenswrapper[4744]: I1205 20:39:29.361484 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:30 crc kubenswrapper[4744]: I1205 20:39:30.521413 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:30 crc kubenswrapper[4744]: I1205 20:39:30.780941 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log"
Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.131018 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-create-s6pk7"]
Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.132347 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.142787 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-s6pk7"] Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.249674 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-2955-account-create-update-td9j2"] Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.250972 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.253454 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-db-secret" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.264822 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-2955-account-create-update-td9j2"] Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.309621 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.309720 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsw49\" (UniqueName: \"kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.372474 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.372877 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-notification-agent" containerID="cri-o://31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906" gracePeriod=30 Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.372914 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="sg-core" containerID="cri-o://dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1" gracePeriod=30 Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.373015 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="proxy-httpd" containerID="cri-o://8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8" gracePeriod=30 Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.372790 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-central-agent" containerID="cri-o://7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8" gracePeriod=30 Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.386033 4744 prober.go:107] "Probe failed" 
probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.187:3000/\": EOF" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.410960 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.411029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.411069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsw49\" (UniqueName: \"kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.411126 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2ms\" (UniqueName: \"kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.411767 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.436056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsw49\" (UniqueName: \"kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49\") pod \"cinder-db-create-s6pk7\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.477278 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.512088 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.512832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.513026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2ms\" (UniqueName: \"kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.532978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2ms\" (UniqueName: \"kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms\") pod \"cinder-2955-account-create-update-td9j2\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.570896 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:31 crc kubenswrapper[4744]: I1205 20:39:31.952525 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.052900 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-s6pk7"] Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075077 4744 generic.go:334] "Generic (PLEG): container finished" podID="014808f0-474c-4664-a953-c8fa28de9765" containerID="8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8" exitCode=0 Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075116 4744 generic.go:334] "Generic (PLEG): container finished" podID="014808f0-474c-4664-a953-c8fa28de9765" containerID="dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1" exitCode=2 Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075127 4744 generic.go:334] "Generic (PLEG): container finished" podID="014808f0-474c-4664-a953-c8fa28de9765" containerID="7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8" exitCode=0 Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075170 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerDied","Data":"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8"} Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075202 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerDied","Data":"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1"} Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.075216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerDied","Data":"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8"} Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.077006 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-s6pk7" event={"ID":"f95dec5c-262e-4af1-a941-7a0d0dc36853","Type":"ContainerStarted","Data":"207b01afb5efdf09f84c4600a4e4102f24ec317437a952e21727b63af1438a1f"} Dec 05 20:39:32 crc kubenswrapper[4744]: I1205 20:39:32.106667 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-2955-account-create-update-td9j2"] Dec 05 20:39:32 crc kubenswrapper[4744]: W1205 20:39:32.116329 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod505e1786_269f_46b4_a25d_5a8e41f9334b.slice/crio-195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1 WatchSource:0}: Error finding container 195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1: Status 404 returned error can't find the container with id 195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1 Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.089551 4744 generic.go:334] "Generic (PLEG): container finished" podID="f95dec5c-262e-4af1-a941-7a0d0dc36853" containerID="7e5fe38608d4f8dc724b655ae172f9a982a4cd752fc32dbffee26ec3acf3d871" exitCode=0 Dec 05 20:39:33 crc kubenswrapper[4744]: 
I1205 20:39:33.089700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-s6pk7" event={"ID":"f95dec5c-262e-4af1-a941-7a0d0dc36853","Type":"ContainerDied","Data":"7e5fe38608d4f8dc724b655ae172f9a982a4cd752fc32dbffee26ec3acf3d871"} Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.093355 4744 generic.go:334] "Generic (PLEG): container finished" podID="505e1786-269f-46b4-a25d-5a8e41f9334b" containerID="dee98d68f5ccc437abde103b17cfd52ec3494a6186c3c32acfba284506b883b9" exitCode=0 Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.093411 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" event={"ID":"505e1786-269f-46b4-a25d-5a8e41f9334b","Type":"ContainerDied","Data":"dee98d68f5ccc437abde103b17cfd52ec3494a6186c3c32acfba284506b883b9"} Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.093443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" event={"ID":"505e1786-269f-46b4-a25d-5a8e41f9334b","Type":"ContainerStarted","Data":"195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1"} Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.123903 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.617159 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747315 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747372 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747399 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747504 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747546 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwrcw\" (UniqueName: \"kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.747636 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd\") pod \"014808f0-474c-4664-a953-c8fa28de9765\" (UID: \"014808f0-474c-4664-a953-c8fa28de9765\") " Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.748364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.748681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.754569 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts" (OuterVolumeSpecName: "scripts") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.755372 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw" (OuterVolumeSpecName: "kube-api-access-rwrcw") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "kube-api-access-rwrcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.775789 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.798589 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.820165 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.849923 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.849967 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.849981 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.849993 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.850007 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.850045 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwrcw\" (UniqueName: \"kubernetes.io/projected/014808f0-474c-4664-a953-c8fa28de9765-kube-api-access-rwrcw\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.850058 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014808f0-474c-4664-a953-c8fa28de9765-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.851574 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data" (OuterVolumeSpecName: "config-data") pod "014808f0-474c-4664-a953-c8fa28de9765" (UID: "014808f0-474c-4664-a953-c8fa28de9765"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:39:33 crc kubenswrapper[4744]: I1205 20:39:33.954475 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014808f0-474c-4664-a953-c8fa28de9765-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.103492 4744 generic.go:334] "Generic (PLEG): container finished" podID="014808f0-474c-4664-a953-c8fa28de9765" containerID="31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906" exitCode=0 Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.103735 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.109967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerDied","Data":"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906"} Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.110025 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"014808f0-474c-4664-a953-c8fa28de9765","Type":"ContainerDied","Data":"300ff1383206ef2ff935085dfef34acec075dc6b62510eddf4f702a5a5af8391"} Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.110047 4744 scope.go:117] "RemoveContainer" containerID="8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.165359 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.174214 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.186473 4744 scope.go:117] "RemoveContainer" containerID="dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.201788 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.202404 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-central-agent" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.202421 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-central-agent" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.202438 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="proxy-httpd" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.202446 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="proxy-httpd" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.202461 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="sg-core" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.202467 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="sg-core" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.202477 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-notification-agent" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.202485 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-notification-agent" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.202637 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="sg-core" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.203541 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-notification-agent" Dec 05 20:39:34 
crc kubenswrapper[4744]: I1205 20:39:34.203566 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="ceilometer-central-agent" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.203580 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="014808f0-474c-4664-a953-c8fa28de9765" containerName="proxy-httpd" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.211553 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.218325 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.219794 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.219915 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.240315 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.263477 4744 scope.go:117] "RemoveContainer" containerID="31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.310741 4744 scope.go:117] "RemoveContainer" containerID="7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.353240 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367066 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367116 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367176 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367195 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vddx\" (UniqueName: \"kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367220 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367237 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367253 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.367306 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.390789 4744 scope.go:117] "RemoveContainer" containerID="8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.394413 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8\": container with ID starting with 8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8 not found: ID does not exist" containerID="8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.394588 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8"} err="failed to get container status \"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8\": rpc error: code = NotFound desc = could not find container \"8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8\": container with ID starting with 8872adf11c156bce6ec679062ac4c91c814e480605804e16122b7ce6ae042cd8 not found: ID does not exist" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.395107 4744 scope.go:117] "RemoveContainer" containerID="dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.397384 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1\": container with ID starting with dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1 not found: ID does not exist" containerID="dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.397471 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1"} err="failed to get container status \"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1\": rpc error: code = NotFound desc = could not find container \"dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1\": container with ID starting with dc8d2218ee7b19bdd27a686f8109bff4491d28c5fd86be6dddf805cc7d3742e1 not found: ID does not exist" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.397548 4744 scope.go:117] "RemoveContainer" containerID="31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.404426 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906\": container with ID starting with 31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906 not found: ID does not exist" containerID="31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.404467 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906"} err="failed to get container status \"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906\": rpc error: code = NotFound desc = could not find container \"31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906\": container with ID starting with 31a04cc9a724771fe609f00f17b322b2285837048913dc9c48d58a140790a906 not found: ID does not exist" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.404494 4744 scope.go:117] "RemoveContainer" containerID="7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8" Dec 05 20:39:34 crc kubenswrapper[4744]: E1205 20:39:34.405066 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8\": container with ID starting with 7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8 not found: ID does not exist" containerID="7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.405085 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8"} err="failed to get container status \"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8\": rpc error: code = NotFound desc = could not find container \"7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8\": container with ID starting with 7d28298fdbd1a4feafd9b525f8017a20cc706a5ef3851e93df12399f830f72f8 not found: ID does not exist" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468611 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd\") pod 
\"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468654 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vddx\" (UniqueName: \"kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468744 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.468831 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.469694 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.469772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.469700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.473277 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.477664 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.478083 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.478364 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.481403 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.487749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vddx\" (UniqueName: \"kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx\") pod \"ceilometer-0\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.541899 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.724522 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.878588 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj2ms\" (UniqueName: \"kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms\") pod \"505e1786-269f-46b4-a25d-5a8e41f9334b\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.878728 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts\") pod \"505e1786-269f-46b4-a25d-5a8e41f9334b\" (UID: \"505e1786-269f-46b4-a25d-5a8e41f9334b\") " Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.879729 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "505e1786-269f-46b4-a25d-5a8e41f9334b" (UID: "505e1786-269f-46b4-a25d-5a8e41f9334b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.884406 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms" (OuterVolumeSpecName: "kube-api-access-sj2ms") pod "505e1786-269f-46b4-a25d-5a8e41f9334b" (UID: "505e1786-269f-46b4-a25d-5a8e41f9334b"). InnerVolumeSpecName "kube-api-access-sj2ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.908696 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.980811 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e1786-269f-46b4-a25d-5a8e41f9334b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:34 crc kubenswrapper[4744]: I1205 20:39:34.980853 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj2ms\" (UniqueName: \"kubernetes.io/projected/505e1786-269f-46b4-a25d-5a8e41f9334b-kube-api-access-sj2ms\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.081453 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts\") pod \"f95dec5c-262e-4af1-a941-7a0d0dc36853\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.081503 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsw49\" (UniqueName: \"kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49\") pod \"f95dec5c-262e-4af1-a941-7a0d0dc36853\" (UID: \"f95dec5c-262e-4af1-a941-7a0d0dc36853\") " Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.081990 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f95dec5c-262e-4af1-a941-7a0d0dc36853" (UID: "f95dec5c-262e-4af1-a941-7a0d0dc36853"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.085302 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49" (OuterVolumeSpecName: "kube-api-access-vsw49") pod "f95dec5c-262e-4af1-a941-7a0d0dc36853" (UID: "f95dec5c-262e-4af1-a941-7a0d0dc36853"). InnerVolumeSpecName "kube-api-access-vsw49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.092880 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:39:35 crc kubenswrapper[4744]: W1205 20:39:35.096325 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2552a5f8_aaad_4da8_b439_6e20032e5a54.slice/crio-10c4a471b2997f135ff62c0fd91bfb7fefe3d927909404ccf6d80ddd44f63acd WatchSource:0}: Error finding container 10c4a471b2997f135ff62c0fd91bfb7fefe3d927909404ccf6d80ddd44f63acd: Status 404 returned error can't find the container with id 10c4a471b2997f135ff62c0fd91bfb7fefe3d927909404ccf6d80ddd44f63acd Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.117043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-s6pk7" event={"ID":"f95dec5c-262e-4af1-a941-7a0d0dc36853","Type":"ContainerDied","Data":"207b01afb5efdf09f84c4600a4e4102f24ec317437a952e21727b63af1438a1f"} Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.117107 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="207b01afb5efdf09f84c4600a4e4102f24ec317437a952e21727b63af1438a1f" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.117054 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-s6pk7" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.122367 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" event={"ID":"505e1786-269f-46b4-a25d-5a8e41f9334b","Type":"ContainerDied","Data":"195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1"} Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.122407 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195e2b38ce07aa4fce5a5a454aa6724ff689af9095c5885b33b7f19ea24d97a1" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.122681 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-2955-account-create-update-td9j2" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.123868 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerStarted","Data":"10c4a471b2997f135ff62c0fd91bfb7fefe3d927909404ccf6d80ddd44f63acd"} Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.183750 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f95dec5c-262e-4af1-a941-7a0d0dc36853-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.183785 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsw49\" (UniqueName: \"kubernetes.io/projected/f95dec5c-262e-4af1-a941-7a0d0dc36853-kube-api-access-vsw49\") on node \"crc\" DevicePath \"\"" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.316760 4744 scope.go:117] "RemoveContainer" containerID="0422966fbd41de2219c78d2eba85d6aa76a02a9daa15ad7be299536d6b089aac" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.339836 4744 scope.go:117] "RemoveContainer" containerID="096a072fc31b569bcb2876897ef894f9e2e1a82acb4b7eadb9b29e34be74b7b5" Dec 05 20:39:35 crc kubenswrapper[4744]: I1205 20:39:35.575127 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.090872 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014808f0-474c-4664-a953-c8fa28de9765" path="/var/lib/kubelet/pods/014808f0-474c-4664-a953-c8fa28de9765/volumes" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.132410 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerStarted","Data":"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec"} Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.491537 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-sync-dhq9h"] Dec 05 20:39:36 crc kubenswrapper[4744]: E1205 20:39:36.492280 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505e1786-269f-46b4-a25d-5a8e41f9334b" containerName="mariadb-account-create-update" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.492323 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="505e1786-269f-46b4-a25d-5a8e41f9334b" containerName="mariadb-account-create-update" Dec 05 20:39:36 crc kubenswrapper[4744]: E1205 20:39:36.492349 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95dec5c-262e-4af1-a941-7a0d0dc36853" containerName="mariadb-database-create" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.492359 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95dec5c-262e-4af1-a941-7a0d0dc36853" containerName="mariadb-database-create" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.492642 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95dec5c-262e-4af1-a941-7a0d0dc36853" containerName="mariadb-database-create" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.492671 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="505e1786-269f-46b4-a25d-5a8e41f9334b" 
containerName="mariadb-account-create-update" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.493399 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.495062 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-dv7z5" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.496893 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.497480 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.506316 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-dhq9h"] Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.609633 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.609676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.609940 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmkv\" (UniqueName: \"kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.610044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.610110 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.610154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711411 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmkv\" (UniqueName: 
\"kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711565 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711656 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.711751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.715788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.716284 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.716323 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data\") pod \"cinder-db-sync-dhq9h\" (UID: 
\"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.716363 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.747749 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmkv\" (UniqueName: \"kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv\") pod \"cinder-db-sync-dhq9h\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.789309 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:36 crc kubenswrapper[4744]: I1205 20:39:36.811119 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:39:37 crc kubenswrapper[4744]: I1205 20:39:37.149446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerStarted","Data":"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045"} Dec 05 20:39:37 crc kubenswrapper[4744]: I1205 20:39:37.149798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerStarted","Data":"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70"} Dec 05 20:39:37 crc kubenswrapper[4744]: W1205 20:39:37.317225 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d27f0f9_e34f_43db_b882_3b3b3609d961.slice/crio-339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065 WatchSource:0}: Error finding container 339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065: Status 404 returned error can't find the container with id 339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065 Dec 05 20:39:37 crc kubenswrapper[4744]: I1205 20:39:37.318616 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-dhq9h"] Dec 05 20:39:37 crc kubenswrapper[4744]: I1205 20:39:37.978836 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:38 crc kubenswrapper[4744]: I1205 20:39:38.161825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" event={"ID":"1d27f0f9-e34f-43db-b882-3b3b3609d961","Type":"ContainerStarted","Data":"339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065"} Dec 05 20:39:39 crc kubenswrapper[4744]: I1205 20:39:39.173011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerStarted","Data":"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1"} Dec 05 20:39:39 crc kubenswrapper[4744]: 
I1205 20:39:39.173376 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:39:39 crc kubenswrapper[4744]: I1205 20:39:39.192738 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.138402969 podStartE2EDuration="5.192722028s" podCreationTimestamp="2025-12-05 20:39:34 +0000 UTC" firstStartedPulling="2025-12-05 20:39:35.100277295 +0000 UTC m=+1745.330088663" lastFinishedPulling="2025-12-05 20:39:38.154596354 +0000 UTC m=+1748.384407722" observedRunningTime="2025-12-05 20:39:39.191941281 +0000 UTC m=+1749.421752659" watchObservedRunningTime="2025-12-05 20:39:39.192722028 +0000 UTC m=+1749.422533396" Dec 05 20:39:39 crc kubenswrapper[4744]: I1205 20:39:39.238400 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:40 crc kubenswrapper[4744]: I1205 20:39:40.414711 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:41 crc kubenswrapper[4744]: I1205 20:39:41.636092 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:42 crc kubenswrapper[4744]: I1205 20:39:42.831495 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:44 crc kubenswrapper[4744]: I1205 20:39:44.017558 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:44 crc kubenswrapper[4744]: I1205 20:39:44.080847 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:39:44 crc kubenswrapper[4744]: E1205 20:39:44.081061 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:39:45 crc kubenswrapper[4744]: I1205 20:39:45.218755 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:46 crc kubenswrapper[4744]: I1205 20:39:46.471853 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:47 crc kubenswrapper[4744]: I1205 20:39:47.725736 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:48 crc kubenswrapper[4744]: I1205 20:39:48.930866 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:50 crc kubenswrapper[4744]: I1205 20:39:50.135948 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:51 crc kubenswrapper[4744]: I1205 20:39:51.318488 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:52 crc kubenswrapper[4744]: I1205 20:39:52.527459 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:53 crc kubenswrapper[4744]: I1205 20:39:53.313952 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" event={"ID":"1d27f0f9-e34f-43db-b882-3b3b3609d961","Type":"ContainerStarted","Data":"f46954bf1567479f870d7f703ec1b67f2fc610ccfbc2abd51cd18993eb1f56e1"} Dec 05 20:39:53 crc kubenswrapper[4744]: I1205 20:39:53.342390 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" podStartSLOduration=2.100274888 podStartE2EDuration="17.342374656s" podCreationTimestamp="2025-12-05 20:39:36 +0000 UTC" firstStartedPulling="2025-12-05 20:39:37.320131335 +0000 UTC m=+1747.549942703" lastFinishedPulling="2025-12-05 20:39:52.562231073 +0000 UTC m=+1762.792042471" observedRunningTime="2025-12-05 20:39:53.336590988 +0000 UTC m=+1763.566402356" watchObservedRunningTime="2025-12-05 20:39:53.342374656 +0000 UTC m=+1763.572186024" Dec 05 20:39:53 crc kubenswrapper[4744]: I1205 20:39:53.703204 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:54 crc kubenswrapper[4744]: I1205 20:39:54.906835 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:56 crc kubenswrapper[4744]: I1205 20:39:56.080072 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:39:56 crc kubenswrapper[4744]: E1205 20:39:56.080602 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:39:56 crc kubenswrapper[4744]: I1205 20:39:56.097800 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:57 crc kubenswrapper[4744]: I1205 20:39:57.321366 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:58 crc kubenswrapper[4744]: I1205 
20:39:58.504640 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:39:59 crc kubenswrapper[4744]: I1205 20:39:59.384144 4744 generic.go:334] "Generic (PLEG): container finished" podID="1d27f0f9-e34f-43db-b882-3b3b3609d961" containerID="f46954bf1567479f870d7f703ec1b67f2fc610ccfbc2abd51cd18993eb1f56e1" exitCode=0 Dec 05 20:39:59 crc kubenswrapper[4744]: I1205 20:39:59.384192 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" event={"ID":"1d27f0f9-e34f-43db-b882-3b3b3609d961","Type":"ContainerDied","Data":"f46954bf1567479f870d7f703ec1b67f2fc610ccfbc2abd51cd18993eb1f56e1"} Dec 05 20:39:59 crc kubenswrapper[4744]: I1205 20:39:59.759156 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.768376 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949159 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949266 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949449 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfmkv\" (UniqueName: \"kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949489 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949539 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949560 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data\") pod \"1d27f0f9-e34f-43db-b882-3b3b3609d961\" (UID: \"1d27f0f9-e34f-43db-b882-3b3b3609d961\") " Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.949814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.950085 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d27f0f9-e34f-43db-b882-3b3b3609d961-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.955020 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.956124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts" (OuterVolumeSpecName: "scripts") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.964393 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.967966 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv" (OuterVolumeSpecName: "kube-api-access-sfmkv") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "kube-api-access-sfmkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:00 crc kubenswrapper[4744]: I1205 20:40:00.996904 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.002981 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data" (OuterVolumeSpecName: "config-data") pod "1d27f0f9-e34f-43db-b882-3b3b3609d961" (UID: "1d27f0f9-e34f-43db-b882-3b3b3609d961"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.051789 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.051816 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.051825 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.051833 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfmkv\" (UniqueName: \"kubernetes.io/projected/1d27f0f9-e34f-43db-b882-3b3b3609d961-kube-api-access-sfmkv\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.051841 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d27f0f9-e34f-43db-b882-3b3b3609d961-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.463364 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" event={"ID":"1d27f0f9-e34f-43db-b882-3b3b3609d961","Type":"ContainerDied","Data":"339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065"} Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.463426 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="339f7a864986211a1b6d6f0e7c45d73a010fc0a34376482d2f320d12051c3065" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.463462 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-dhq9h" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.817424 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:01 crc kubenswrapper[4744]: E1205 20:40:01.817743 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d27f0f9-e34f-43db-b882-3b3b3609d961" containerName="cinder-db-sync" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.817755 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d27f0f9-e34f-43db-b882-3b3b3609d961" containerName="cinder-db-sync" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.817913 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d27f0f9-e34f-43db-b882-3b3b3609d961" containerName="cinder-db-sync" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.818949 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.821856 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-dv7z5" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.822089 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.822232 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.823808 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.842584 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.854861 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.861912 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.867386 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.870427 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966451 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966488 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966513 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfmr\" (UniqueName: \"kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966686 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966838 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966921 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.966966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgr6t\" (UniqueName: \"kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967003 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967083 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967177 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967264 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967325 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967376 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967393 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967485 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:01 crc kubenswrapper[4744]: I1205 20:40:01.967499 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.023971 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.026107 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.030578 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.052959 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075277 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075412 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075486 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075503 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075517 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075590 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075605 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075669 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfmr\" (UniqueName: \"kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075698 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075751 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc 
kubenswrapper[4744]: I1205 20:40:02.075789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075806 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgr6t\" (UniqueName: \"kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075839 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075870 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.075901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.077600 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.077667 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.096130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.096221 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.098038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.098160 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.098233 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.101717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.101772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.101801 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.102169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.110436 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.111365 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.111519 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.111653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.111788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.117420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.117776 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.118322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.123719 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.135273 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.157037 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgr6t\" (UniqueName: \"kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t\") pod \"cinder-scheduler-0\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.173942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfmr\" (UniqueName: \"kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr\") pod \"cinder-backup-0\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177338 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177433 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177768 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrt95\" (UniqueName: \"kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177882 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.177993 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.187631 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.188160 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279886 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279933 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.279969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.280004 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrt95\" (UniqueName: \"kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.280039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.280150 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.285092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.285751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.288941 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.290566 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.291128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.295092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.306883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrt95\" (UniqueName: \"kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95\") pod \"cinder-api-0\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.347350 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.433725 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.795389 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.840296 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:02 crc kubenswrapper[4744]: W1205 20:40:02.841942 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c4d668_1c4d_42d0_b244_d4baf8d9eb85.slice/crio-80afe946e24e206e389cbc0697bb23ccf2c8d207f46e36b962e42c4014ac4881 WatchSource:0}: Error finding container 80afe946e24e206e389cbc0697bb23ccf2c8d207f46e36b962e42c4014ac4881: Status 404 returned error can't find the container with id 80afe946e24e206e389cbc0697bb23ccf2c8d207f46e36b962e42c4014ac4881 Dec 05 20:40:02 crc kubenswrapper[4744]: W1205 20:40:02.878630 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5018300a_b041_46cb_a989_0a490d4029c1.slice/crio-f844fcb8833a1516638e872a5fec4d815eb0f573cf5517cb5a6204cda317d6ac WatchSource:0}: Error finding container f844fcb8833a1516638e872a5fec4d815eb0f573cf5517cb5a6204cda317d6ac: Status 404 returned error can't find the container with id f844fcb8833a1516638e872a5fec4d815eb0f573cf5517cb5a6204cda317d6ac Dec 05 20:40:02 crc kubenswrapper[4744]: I1205 20:40:02.880663 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:03 crc kubenswrapper[4744]: I1205 20:40:03.406983 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:03 crc kubenswrapper[4744]: I1205 20:40:03.540169 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerStarted","Data":"80afe946e24e206e389cbc0697bb23ccf2c8d207f46e36b962e42c4014ac4881"} Dec 05 20:40:03 crc kubenswrapper[4744]: I1205 20:40:03.541179 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerStarted","Data":"f844fcb8833a1516638e872a5fec4d815eb0f573cf5517cb5a6204cda317d6ac"} Dec 05 20:40:03 crc kubenswrapper[4744]: I1205 20:40:03.543628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerStarted","Data":"8564c3f81ee0a509ef8403f8b34c07b4cfd353988076d6d6a850b5a0df8b0596"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.295141 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.571746 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerStarted","Data":"d93f93952de791e8f231610a4b44146881ce83903d5a35c4a922058eba7c4c37"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.571974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" 
event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerStarted","Data":"70443408b5fdf4484de26c05f580c671cc4ed32ea46353b696414d800af59d58"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.579137 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerStarted","Data":"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.588190 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.593835 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.602267 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerStarted","Data":"49c772b2b0a59e4ba233aa4092b0308331956ad080d21acd1aea3b9b8ba3c7bb"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.602326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerStarted","Data":"9877224925f8cbe7b1efbe46d8e52936e3cdcf8750e2192c5e94395351f2d5c9"} Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.602445 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api-log" containerID="cri-o://9877224925f8cbe7b1efbe46d8e52936e3cdcf8750e2192c5e94395351f2d5c9" gracePeriod=30 Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.602544 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.602576 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api" containerID="cri-o://49c772b2b0a59e4ba233aa4092b0308331956ad080d21acd1aea3b9b8ba3c7bb" gracePeriod=30 Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.609566 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.768756262 podStartE2EDuration="3.609547432s" podCreationTimestamp="2025-12-05 20:40:01 +0000 UTC" firstStartedPulling="2025-12-05 20:40:02.802470944 +0000 UTC m=+1773.032282312" lastFinishedPulling="2025-12-05 20:40:03.643262114 +0000 UTC m=+1773.873073482" observedRunningTime="2025-12-05 20:40:04.602888975 +0000 UTC m=+1774.832700343" watchObservedRunningTime="2025-12-05 20:40:04.609547432 +0000 UTC m=+1774.839358800" Dec 05 20:40:04 crc kubenswrapper[4744]: I1205 20:40:04.678699 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=3.6786826379999997 podStartE2EDuration="3.678682638s" podCreationTimestamp="2025-12-05 20:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:40:04.675788694 +0000 UTC m=+1774.905600072" watchObservedRunningTime="2025-12-05 
20:40:04.678682638 +0000 UTC m=+1774.908494006" Dec 05 20:40:05 crc kubenswrapper[4744]: I1205 20:40:05.611382 4744 generic.go:334] "Generic (PLEG): container finished" podID="5018300a-b041-46cb-a989-0a490d4029c1" containerID="9877224925f8cbe7b1efbe46d8e52936e3cdcf8750e2192c5e94395351f2d5c9" exitCode=143 Dec 05 20:40:05 crc kubenswrapper[4744]: I1205 20:40:05.612669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerDied","Data":"9877224925f8cbe7b1efbe46d8e52936e3cdcf8750e2192c5e94395351f2d5c9"} Dec 05 20:40:05 crc kubenswrapper[4744]: I1205 20:40:05.839985 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:07 crc kubenswrapper[4744]: I1205 20:40:07.074588 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:07 crc kubenswrapper[4744]: I1205 20:40:07.192129 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:08 crc kubenswrapper[4744]: I1205 20:40:08.283097 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:08 crc kubenswrapper[4744]: I1205 20:40:08.638133 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerStarted","Data":"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09"} Dec 05 20:40:08 crc kubenswrapper[4744]: I1205 20:40:08.672646 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=6.874316407 podStartE2EDuration="7.672628263s" podCreationTimestamp="2025-12-05 20:40:01 +0000 UTC" firstStartedPulling="2025-12-05 20:40:02.8437292 +0000 UTC m=+1773.073540568" lastFinishedPulling="2025-12-05 20:40:03.642041066 +0000 UTC m=+1773.871852424" observedRunningTime="2025-12-05 20:40:08.667446618 +0000 UTC m=+1778.897258036" watchObservedRunningTime="2025-12-05 20:40:08.672628263 +0000 UTC m=+1778.902439641" Dec 05 20:40:09 crc kubenswrapper[4744]: I1205 20:40:09.471066 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:10 crc kubenswrapper[4744]: I1205 20:40:10.090465 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:40:10 crc kubenswrapper[4744]: E1205 20:40:10.091037 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:40:10 crc kubenswrapper[4744]: I1205 20:40:10.688672 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:11 crc kubenswrapper[4744]: I1205 20:40:11.934962 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.399713 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.434754 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.442746 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.676675 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="cinder-backup" containerID="cri-o://70443408b5fdf4484de26c05f580c671cc4ed32ea46353b696414d800af59d58" gracePeriod=30 Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.676770 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="probe" containerID="cri-o://d93f93952de791e8f231610a4b44146881ce83903d5a35c4a922058eba7c4c37" gracePeriod=30 Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.696836 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:12 crc kubenswrapper[4744]: I1205 20:40:12.734836 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.120363 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.700327 4744 generic.go:334] "Generic (PLEG): container finished" podID="97855898-4f59-40d1-b087-1ce5af03dad6" containerID="d93f93952de791e8f231610a4b44146881ce83903d5a35c4a922058eba7c4c37" exitCode=0 Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.700420 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerDied","Data":"d93f93952de791e8f231610a4b44146881ce83903d5a35c4a922058eba7c4c37"} Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.700534 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="cinder-scheduler" containerID="cri-o://0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b" gracePeriod=30 Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.700855 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="probe" containerID="cri-o://e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09" gracePeriod=30 Dec 05 20:40:13 crc 
kubenswrapper[4744]: I1205 20:40:13.890930 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:13 crc kubenswrapper[4744]: I1205 20:40:13.891150 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" containerName="watcher-decision-engine" containerID="cri-o://07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" gracePeriod=30 Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.351099 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.394811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.710193 4744 generic.go:334] "Generic (PLEG): container finished" podID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerID="e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09" exitCode=0 Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.710242 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerDied","Data":"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09"} Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.823459 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.823727 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-central-agent" containerID="cri-o://88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec" gracePeriod=30 Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.823876 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="proxy-httpd" containerID="cri-o://f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1" gracePeriod=30 Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.823932 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="sg-core" containerID="cri-o://2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045" gracePeriod=30 Dec 05 20:40:14 crc kubenswrapper[4744]: I1205 20:40:14.823970 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-notification-agent" containerID="cri-o://89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70" gracePeriod=30 Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.598379 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721054 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerID="f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1" exitCode=0 Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721086 4744 generic.go:334] "Generic (PLEG): container finished" podID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerID="2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045" exitCode=2 Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721094 4744 generic.go:334] "Generic (PLEG): container finished" podID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerID="88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec" exitCode=0 Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721102 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerDied","Data":"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1"} Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721156 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerDied","Data":"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045"} Dec 05 20:40:15 crc kubenswrapper[4744]: I1205 20:40:15.721173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerDied","Data":"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec"} Dec 05 20:40:16 crc kubenswrapper[4744]: I1205 20:40:16.819854 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.354994 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497092 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497125 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgr6t\" (UniqueName: \"kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497302 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497317 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497337 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls\") pod \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\" (UID: \"94c4d668-1c4d-42d0-b244-d4baf8d9eb85\") " Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.497799 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.502720 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts" (OuterVolumeSpecName: "scripts") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.521584 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t" (OuterVolumeSpecName: "kube-api-access-lgr6t") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "kube-api-access-lgr6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.528683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.558760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.599009 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.599035 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgr6t\" (UniqueName: \"kubernetes.io/projected/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-kube-api-access-lgr6t\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.599046 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.599055 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.599062 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.601204 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data" (OuterVolumeSpecName: "config-data") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.677434 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "94c4d668-1c4d-42d0-b244-d4baf8d9eb85" (UID: "94c4d668-1c4d-42d0-b244-d4baf8d9eb85"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.702656 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.702704 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c4d668-1c4d-42d0-b244-d4baf8d9eb85-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.764329 4744 generic.go:334] "Generic (PLEG): container finished" podID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerID="0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b" exitCode=0 Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.764407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerDied","Data":"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b"} Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.764430 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.764448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"94c4d668-1c4d-42d0-b244-d4baf8d9eb85","Type":"ContainerDied","Data":"80afe946e24e206e389cbc0697bb23ccf2c8d207f46e36b962e42c4014ac4881"} Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.764470 4744 scope.go:117] "RemoveContainer" containerID="e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.769274 4744 generic.go:334] "Generic (PLEG): container finished" podID="97855898-4f59-40d1-b087-1ce5af03dad6" containerID="70443408b5fdf4484de26c05f580c671cc4ed32ea46353b696414d800af59d58" exitCode=0 Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.769331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerDied","Data":"70443408b5fdf4484de26c05f580c671cc4ed32ea46353b696414d800af59d58"} Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.804208 4744 scope.go:117] "RemoveContainer" containerID="0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.811383 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.833544 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.843405 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:17 crc kubenswrapper[4744]: E1205 20:40:17.843732 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="cinder-scheduler" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.843744 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="cinder-scheduler" Dec 05 20:40:17 crc 
kubenswrapper[4744]: E1205 20:40:17.843756 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="probe" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.843762 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="probe" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.843910 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="probe" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.843929 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" containerName="cinder-scheduler" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.844743 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.860553 4744 scope.go:117] "RemoveContainer" containerID="e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.860737 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Dec 05 20:40:17 crc kubenswrapper[4744]: E1205 20:40:17.861306 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09\": container with ID starting with e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09 not found: ID does not exist" containerID="e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.861329 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09"} err="failed to get container status \"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09\": rpc error: code = NotFound desc = could not find container \"e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09\": container with ID starting with e1e282bd6aec043f9ae00b0981a0db329cfe88a861a2deab9ff16907f7b92f09 not found: ID does not exist" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.861347 4744 scope.go:117] "RemoveContainer" containerID="0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b" Dec 05 20:40:17 crc kubenswrapper[4744]: E1205 20:40:17.865668 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b\": container with ID starting with 0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b not found: ID does not exist" containerID="0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.865718 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b"} err="failed to get container status \"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b\": rpc error: code = NotFound desc = could not find container \"0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b\": container with ID starting with 
0492aef81bd36b7224c1a8dba1ce6cd987e23c470a0c89fe86fdea2e0d3b103b not found: ID does not exist" Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.886032 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:17 crc kubenswrapper[4744]: I1205 20:40:17.907417 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.000694 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ea3a6e04-6c1e-4661-914a-be0fb1ea8792/watcher-decision-engine/0.log" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.007549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.007592 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.007636 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.008461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.008753 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.008916 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xff9p\" (UniqueName: \"kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.009053 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 
20:40:18.089499 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c4d668-1c4d-42d0-b244-d4baf8d9eb85" path="/var/lib/kubelet/pods/94c4d668-1c4d-42d0-b244-d4baf8d9eb85/volumes" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110332 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110378 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110395 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110427 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110450 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110496 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110569 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110588 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110605 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run" (OuterVolumeSpecName: "run") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110608 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110689 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev" (OuterVolumeSpecName: "dev") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110692 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110721 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110743 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110761 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110785 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110816 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110834 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110858 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvfmr\" (UniqueName: \"kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110888 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.110975 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts\") pod \"97855898-4f59-40d1-b087-1ce5af03dad6\" (UID: \"97855898-4f59-40d1-b087-1ce5af03dad6\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111190 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xff9p\" (UniqueName: \"kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p\") pod \"cinder-scheduler-0\" (UID: 
\"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111280 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111382 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111490 4744 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111500 4744 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111509 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111518 4744 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111527 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111535 4744 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-dev\") on node \"crc\" DevicePath \"\"" Dec 
05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111544 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111829 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111923 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.111923 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys" (OuterVolumeSpecName: "sys") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.112425 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.115877 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr" (OuterVolumeSpecName: "kube-api-access-qvfmr") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "kube-api-access-qvfmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.118525 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.120506 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.122504 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.126663 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.126744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.126772 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts" (OuterVolumeSpecName: "scripts") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.131869 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xff9p\" (UniqueName: \"kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.132197 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.169447 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.175760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.208225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data" (OuterVolumeSpecName: "config-data") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212747 4744 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212779 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212788 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212797 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212804 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212812 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212822 4744 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97855898-4f59-40d1-b087-1ce5af03dad6-sys\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.212829 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvfmr\" (UniqueName: \"kubernetes.io/projected/97855898-4f59-40d1-b087-1ce5af03dad6-kube-api-access-qvfmr\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.279444 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "97855898-4f59-40d1-b087-1ce5af03dad6" (UID: "97855898-4f59-40d1-b087-1ce5af03dad6"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.314211 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/97855898-4f59-40d1-b087-1ce5af03dad6-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.654334 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:18 crc kubenswrapper[4744]: W1205 20:40:18.658090 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d9c66b1_f531_4967_86ae_287a1ce3a1c7.slice/crio-7551a65a7e7ba8d2eebc7578d80a518cbd6a761efbc22312ab844d4c5581dede WatchSource:0}: Error finding container 7551a65a7e7ba8d2eebc7578d80a518cbd6a761efbc22312ab844d4c5581dede: Status 404 returned error can't find the container with id 7551a65a7e7ba8d2eebc7578d80a518cbd6a761efbc22312ab844d4c5581dede Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.671314 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 is running failed: container process not found" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.671776 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 is running failed: container process not found" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.672230 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 is running failed: container process not found" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.672268 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 is running failed: container process not found" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" containerName="watcher-decision-engine" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.777641 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.784849 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" exitCode=0 Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.784927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ea3a6e04-6c1e-4661-914a-be0fb1ea8792","Type":"ContainerDied","Data":"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7"} Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.784968 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ea3a6e04-6c1e-4661-914a-be0fb1ea8792","Type":"ContainerDied","Data":"17dcd65dcf4422324515c9a90c877cb339f13f5ff5beb0e8d3c41e70f2d764d4"} Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.784993 4744 scope.go:117] "RemoveContainer" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.785205 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.789989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerStarted","Data":"7551a65a7e7ba8d2eebc7578d80a518cbd6a761efbc22312ab844d4c5581dede"} Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.809443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"97855898-4f59-40d1-b087-1ce5af03dad6","Type":"ContainerDied","Data":"8564c3f81ee0a509ef8403f8b34c07b4cfd353988076d6d6a850b5a0df8b0596"} Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.809781 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.836751 4744 scope.go:117] "RemoveContainer" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.837133 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7\": container with ID starting with 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 not found: ID does not exist" containerID="07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.837166 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7"} err="failed to get container status \"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7\": rpc error: code = NotFound desc = could not find container \"07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7\": container with ID starting with 07f4b172c9b9a8019694dd57da349cfbabb1c449c94731aea93a0faf53aa69a7 not found: ID does not exist" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.837190 4744 scope.go:117] "RemoveContainer" containerID="d93f93952de791e8f231610a4b44146881ce83903d5a35c4a922058eba7c4c37" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.856334 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.871127 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.895731 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.896328 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" containerName="watcher-decision-engine" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896352 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" containerName="watcher-decision-engine" Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.896403 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="probe" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896412 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="probe" Dec 05 20:40:18 crc kubenswrapper[4744]: E1205 20:40:18.896433 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="cinder-backup" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896441 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="cinder-backup" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896688 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="probe" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896737 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" 
containerName="watcher-decision-engine" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.896752 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" containerName="cinder-backup" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.897885 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.899964 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.902507 4744 scope.go:117] "RemoveContainer" containerID="70443408b5fdf4484de26c05f580c671cc4ed32ea46353b696414d800af59d58" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.905306 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.920747 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.920793 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ww5h\" (UniqueName: \"kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.920866 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.920894 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.920947 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.921029 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data\") pod \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\" (UID: \"ea3a6e04-6c1e-4661-914a-be0fb1ea8792\") " Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.921610 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs" (OuterVolumeSpecName: "logs") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.928634 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h" (OuterVolumeSpecName: "kube-api-access-5ww5h") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "kube-api-access-5ww5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.968409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.968655 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:18 crc kubenswrapper[4744]: I1205 20:40:18.989180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data" (OuterVolumeSpecName: "config-data") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023259 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023789 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023812 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023886 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.023981 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024007 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024040 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024066 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc 
kubenswrapper[4744]: I1205 20:40:19.024084 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024127 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024162 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024183 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fk2\" (UniqueName: \"kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024881 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.024936 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.025018 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.025041 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ww5h\" (UniqueName: \"kubernetes.io/projected/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-kube-api-access-5ww5h\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.025055 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc 
kubenswrapper[4744]: I1205 20:40:19.025066 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.025076 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.048043 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ea3a6e04-6c1e-4661-914a-be0fb1ea8792" (UID: "ea3a6e04-6c1e-4661-914a-be0fb1ea8792"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.125954 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.125991 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126015 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126041 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126087 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" 
Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126144 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fk2\" (UniqueName: \"kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126188 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126214 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126241 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126282 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126320 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126337 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126437 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.126747 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ea3a6e04-6c1e-4661-914a-be0fb1ea8792-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127172 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127206 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127230 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127264 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127312 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127324 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127374 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127422 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.127444 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.130946 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.132353 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.135283 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.135768 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.138657 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.141189 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.146869 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.151026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fk2\" (UniqueName: \"kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2\") pod \"cinder-backup-0\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.167040 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.168392 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.172530 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.185721 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.228378 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.228435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.228506 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.228908 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.229542 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9km96\" (UniqueName: \"kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.229594 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.229658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9km96\" (UniqueName: \"kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331162 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331200 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331268 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.331372 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.332114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.339450 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.339605 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.344396 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.344460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.354010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9km96\" (UniqueName: \"kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.489615 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.647255 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.785764 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:19 crc kubenswrapper[4744]: W1205 20:40:19.789485 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85ac7b0_b06b_4f8e_8330_088bb19d433f.slice/crio-e0f37cf2aa3da8c391472b8c9774becb4ddeb00154167f2a365aa4dd5c0c52c0 WatchSource:0}: Error finding container e0f37cf2aa3da8c391472b8c9774becb4ddeb00154167f2a365aa4dd5c0c52c0: Status 404 returned error can't find the container with id e0f37cf2aa3da8c391472b8c9774becb4ddeb00154167f2a365aa4dd5c0c52c0 Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.844958 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.844998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845039 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845091 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vddx\" (UniqueName: \"kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845175 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: \"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.845347 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd\") pod \"2552a5f8-aaad-4da8-b439-6e20032e5a54\" (UID: 
\"2552a5f8-aaad-4da8-b439-6e20032e5a54\") " Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.847789 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.848883 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.851616 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx" (OuterVolumeSpecName: "kube-api-access-7vddx") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "kube-api-access-7vddx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.852450 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts" (OuterVolumeSpecName: "scripts") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.883644 4744 generic.go:334] "Generic (PLEG): container finished" podID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerID="89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70" exitCode=0 Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.883744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerDied","Data":"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70"} Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.883774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2552a5f8-aaad-4da8-b439-6e20032e5a54","Type":"ContainerDied","Data":"10c4a471b2997f135ff62c0fd91bfb7fefe3d927909404ccf6d80ddd44f63acd"} Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.883792 4744 scope.go:117] "RemoveContainer" containerID="f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.883906 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.898517 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.912171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerStarted","Data":"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94"} Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.918905 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerStarted","Data":"e0f37cf2aa3da8c391472b8c9774becb4ddeb00154167f2a365aa4dd5c0c52c0"} Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.932379 4744 scope.go:117] "RemoveContainer" containerID="2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.937503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.939664 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948119 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948142 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vddx\" (UniqueName: \"kubernetes.io/projected/2552a5f8-aaad-4da8-b439-6e20032e5a54-kube-api-access-7vddx\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948152 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948161 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948170 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948179 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:19 crc kubenswrapper[4744]: I1205 20:40:19.948186 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2552a5f8-aaad-4da8-b439-6e20032e5a54-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.006701 4744 scope.go:117] "RemoveContainer" containerID="89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.027845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data" (OuterVolumeSpecName: "config-data") pod "2552a5f8-aaad-4da8-b439-6e20032e5a54" (UID: "2552a5f8-aaad-4da8-b439-6e20032e5a54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.044723 4744 scope.go:117] "RemoveContainer" containerID="88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.047650 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.049606 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2552a5f8-aaad-4da8-b439-6e20032e5a54-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:20 crc kubenswrapper[4744]: W1205 20:40:20.049688 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1cfd875_844b_4246_b4fd_9286f7f4ca81.slice/crio-09219520111b2defa005eda395f59200d6d6c7f8c61783e6af876aa2d9e836b6 WatchSource:0}: Error finding container 09219520111b2defa005eda395f59200d6d6c7f8c61783e6af876aa2d9e836b6: Status 404 returned error can't find the container with id 09219520111b2defa005eda395f59200d6d6c7f8c61783e6af876aa2d9e836b6 Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.068681 4744 scope.go:117] "RemoveContainer" containerID="f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.069377 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1\": container with ID starting with f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1 not found: ID does not exist" containerID="f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.069426 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1"} err="failed to get container status \"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1\": rpc error: code = NotFound desc = could not find container \"f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1\": container with ID starting with f0522620c8ea2f142a841313769e7a94e6044636f9e9163a0d3c8ad50e39fcc1 not found: ID does not exist" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.069464 4744 scope.go:117] "RemoveContainer" containerID="2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.070239 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045\": 
container with ID starting with 2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045 not found: ID does not exist" containerID="2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.070279 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045"} err="failed to get container status \"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045\": rpc error: code = NotFound desc = could not find container \"2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045\": container with ID starting with 2c9ac59b54298d20ec962e28c8b555fec50245c9f212895738df6e78c933f045 not found: ID does not exist" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.070320 4744 scope.go:117] "RemoveContainer" containerID="89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.070595 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70\": container with ID starting with 89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70 not found: ID does not exist" containerID="89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.070626 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70"} err="failed to get container status \"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70\": rpc error: code = NotFound desc = could not find container \"89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70\": container with ID starting with 89e9150ad52d3798fe737ec3d53fdac3cd062f83599d7e776c0ef7ee14806f70 not found: ID does not exist" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.070687 4744 scope.go:117] "RemoveContainer" containerID="88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.071355 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec\": container with ID starting with 88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec not found: ID does not exist" containerID="88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.071383 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec"} err="failed to get container status \"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec\": rpc error: code = NotFound desc = could not find container \"88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec\": container with ID starting with 88ae3ca80b5f18b3ece09c67bb4753ba487b775cc1cacd344539b3883f77d7ec not found: ID does not exist" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.094133 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97855898-4f59-40d1-b087-1ce5af03dad6" path="/var/lib/kubelet/pods/97855898-4f59-40d1-b087-1ce5af03dad6/volumes" Dec 05 20:40:20 crc 
kubenswrapper[4744]: I1205 20:40:20.094896 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3a6e04-6c1e-4661-914a-be0fb1ea8792" path="/var/lib/kubelet/pods/ea3a6e04-6c1e-4661-914a-be0fb1ea8792/volumes" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.224537 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.229470 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.268367 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.268926 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-notification-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.268994 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-notification-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.269010 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="proxy-httpd" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.269017 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="proxy-httpd" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.269028 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-central-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.269034 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-central-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: E1205 20:40:20.269046 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="sg-core" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.269051 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="sg-core" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.283443 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="sg-core" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.283478 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="proxy-httpd" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.283514 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-notification-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.283530 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" containerName="ceilometer-central-agent" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.285055 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.285160 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.287830 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.287978 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.292403 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.360314 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.360364 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.360390 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.360427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.374560 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94lw\" (UniqueName: \"kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.374700 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.374752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.374771 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476186 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476281 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476320 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476353 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94lw\" (UniqueName: \"kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476436 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.476860 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.478350 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.487320 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.487995 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.488020 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.491831 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.493875 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.501840 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94lw\" (UniqueName: \"kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw\") pod \"ceilometer-0\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.602956 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.959923 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerStarted","Data":"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c"} Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.960318 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerStarted","Data":"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33"} Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.966480 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerStarted","Data":"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890"} Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.968469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e1cfd875-844b-4246-b4fd-9286f7f4ca81","Type":"ContainerStarted","Data":"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468"} Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.968493 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e1cfd875-844b-4246-b4fd-9286f7f4ca81","Type":"ContainerStarted","Data":"09219520111b2defa005eda395f59200d6d6c7f8c61783e6af876aa2d9e836b6"} Dec 05 20:40:20 crc kubenswrapper[4744]: I1205 20:40:20.981598 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.981584245 podStartE2EDuration="2.981584245s" podCreationTimestamp="2025-12-05 20:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:40:20.979980066 +0000 UTC m=+1791.209791434" watchObservedRunningTime="2025-12-05 20:40:20.981584245 +0000 UTC m=+1791.211395613" Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.004766 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=4.004748712 podStartE2EDuration="4.004748712s" podCreationTimestamp="2025-12-05 20:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:40:21.001228216 +0000 UTC m=+1791.231039594" watchObservedRunningTime="2025-12-05 20:40:21.004748712 +0000 UTC m=+1791.234560080" Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.016301 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.016262183 podStartE2EDuration="2.016262183s" podCreationTimestamp="2025-12-05 20:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:40:21.013724341 +0000 UTC m=+1791.243535709" watchObservedRunningTime="2025-12-05 20:40:21.016262183 +0000 UTC m=+1791.246073551" Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.063054 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.524704 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.983428 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerStarted","Data":"246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161"} Dec 05 20:40:21 crc kubenswrapper[4744]: I1205 20:40:21.983818 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerStarted","Data":"3e5ff9046352cf3793beaa233f390123bde31f84a16bad561e50841bbccb5587"} Dec 05 20:40:22 crc kubenswrapper[4744]: I1205 20:40:22.081765 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:40:22 crc kubenswrapper[4744]: E1205 20:40:22.082046 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:40:22 crc kubenswrapper[4744]: I1205 20:40:22.089744 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2552a5f8-aaad-4da8-b439-6e20032e5a54" path="/var/lib/kubelet/pods/2552a5f8-aaad-4da8-b439-6e20032e5a54/volumes" Dec 05 20:40:22 crc kubenswrapper[4744]: I1205 20:40:22.678644 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:22 crc kubenswrapper[4744]: I1205 20:40:22.993543 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerStarted","Data":"5d73214a3642252484df6ce52dacebae3c72ebc72fd64b160d8b1c3e2ce84296"} Dec 05 20:40:23 crc kubenswrapper[4744]: I1205 20:40:23.177014 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:23 crc kubenswrapper[4744]: I1205 20:40:23.857375 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:24 crc kubenswrapper[4744]: I1205 20:40:24.003276 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerStarted","Data":"1467df7c76cb2807243c1d1ac62c74a3774d7572933e4f11c1fe28e63cf17bc0"} Dec 05 20:40:24 crc kubenswrapper[4744]: I1205 20:40:24.230062 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:25 crc kubenswrapper[4744]: I1205 20:40:25.099602 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:26 crc kubenswrapper[4744]: I1205 20:40:26.023279 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerStarted","Data":"01c01d346ca2c807d91471dd8ce7d99511b3eda18615b3868f36f8866a5d34e6"} Dec 05 20:40:26 crc kubenswrapper[4744]: I1205 20:40:26.023518 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:26 crc kubenswrapper[4744]: I1205 20:40:26.043731 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.333042749 podStartE2EDuration="6.043712634s" podCreationTimestamp="2025-12-05 20:40:20 +0000 UTC" firstStartedPulling="2025-12-05 20:40:21.077511039 +0000 UTC m=+1791.307322407" lastFinishedPulling="2025-12-05 20:40:24.788180894 +0000 UTC m=+1795.017992292" observedRunningTime="2025-12-05 20:40:26.041145701 +0000 UTC m=+1796.270957069" watchObservedRunningTime="2025-12-05 20:40:26.043712634 +0000 UTC m=+1796.273524002" Dec 05 20:40:26 crc kubenswrapper[4744]: I1205 20:40:26.358841 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:27 crc kubenswrapper[4744]: I1205 20:40:27.592062 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:28 crc kubenswrapper[4744]: I1205 20:40:28.374076 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:28 crc kubenswrapper[4744]: I1205 20:40:28.789362 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:29 crc kubenswrapper[4744]: I1205 20:40:29.444235 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:29 crc kubenswrapper[4744]: I1205 20:40:29.490335 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:29 crc kubenswrapper[4744]: I1205 20:40:29.533486 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:30 crc kubenswrapper[4744]: I1205 20:40:30.013010 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:30 crc kubenswrapper[4744]: I1205 20:40:30.067967 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:30 crc kubenswrapper[4744]: I1205 20:40:30.113186 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.259857 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.515685 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.619936 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-dhq9h"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.632876 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-dhq9h"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.644233 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.644484 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="cinder-scheduler" containerID="cri-o://b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" gracePeriod=30 Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.644854 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="probe" containerID="cri-o://8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" gracePeriod=30 Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.656546 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.656785 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="cinder-backup" containerID="cri-o://7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" gracePeriod=30 Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.656916 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="probe" containerID="cri-o://951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" gracePeriod=30 Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.704090 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder2955-account-delete-jntzk"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.708862 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.722000 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder2955-account-delete-jntzk"] Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.792525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwv5l\" (UniqueName: \"kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.792614 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.893733 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv5l\" (UniqueName: \"kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.893835 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.894593 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:31 crc kubenswrapper[4744]: I1205 20:40:31.929096 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwv5l\" (UniqueName: \"kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l\") pod \"cinder2955-account-delete-jntzk\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.058395 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.089770 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d27f0f9-e34f-43db-b882-3b3b3609d961" path="/var/lib/kubelet/pods/1d27f0f9-e34f-43db-b882-3b3b3609d961/volumes" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.568159 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder2955-account-delete-jntzk"] Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.742631 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.754485 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.812992 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813375 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813412 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813494 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xff9p\" (UniqueName: \"kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813575 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.813609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls\") pod \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\" (UID: \"9d9c66b1-f531-4967-86ae-287a1ce3a1c7\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.815969 4744 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.820266 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts" (OuterVolumeSpecName: "scripts") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.820525 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p" (OuterVolumeSpecName: "kube-api-access-xff9p") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "kube-api-access-xff9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.834275 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.853929 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.875460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.916623 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fk2\" (UniqueName: \"kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.917375 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.918234 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.918390 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.918504 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919135 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.917457 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev" (OuterVolumeSpecName: "dev") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.918355 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919209 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919232 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919438 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919512 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys" (OuterVolumeSpecName: "sys") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919556 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919589 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919613 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919652 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick\") pod \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\" (UID: \"e85ac7b0-b06b-4f8e-8330-088bb19d433f\") " Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919555 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.919887 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920059 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run" (OuterVolumeSpecName: "run") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920165 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920184 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920195 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920205 4744 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-dev\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920214 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920222 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920231 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920239 4744 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920246 4744 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-sys\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920254 4744 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920263 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xff9p\" (UniqueName: \"kubernetes.io/projected/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-kube-api-access-xff9p\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920274 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.920833 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.922510 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2" (OuterVolumeSpecName: "kube-api-access-b5fk2") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "kube-api-access-b5fk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.923509 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.925567 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts" (OuterVolumeSpecName: "scripts") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.976542 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data" (OuterVolumeSpecName: "config-data") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.985426 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:32 crc kubenswrapper[4744]: I1205 20:40:32.991933 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9d9c66b1-f531-4967-86ae-287a1ce3a1c7" (UID: "9d9c66b1-f531-4967-86ae-287a1ce3a1c7"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.004995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data" (OuterVolumeSpecName: "config-data") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021600 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021782 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021844 4744 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-run\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021896 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021948 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.021999 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fk2\" (UniqueName: \"kubernetes.io/projected/e85ac7b0-b06b-4f8e-8330-088bb19d433f-kube-api-access-b5fk2\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.022050 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.022104 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c66b1-f531-4967-86ae-287a1ce3a1c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.022167 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.022218 4744 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e85ac7b0-b06b-4f8e-8330-088bb19d433f-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.091442 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e85ac7b0-b06b-4f8e-8330-088bb19d433f" (UID: "e85ac7b0-b06b-4f8e-8330-088bb19d433f"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095150 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerID="8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" exitCode=0 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095272 4744 generic.go:334] "Generic (PLEG): container finished" podID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerID="b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" exitCode=0 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095392 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerDied","Data":"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerDied","Data":"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095785 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"9d9c66b1-f531-4967-86ae-287a1ce3a1c7","Type":"ContainerDied","Data":"7551a65a7e7ba8d2eebc7578d80a518cbd6a761efbc22312ab844d4c5581dede"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095801 4744 scope.go:117] "RemoveContainer" containerID="8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.095507 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.097451 4744 generic.go:334] "Generic (PLEG): container finished" podID="9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" containerID="d751b18d85435dbf5e8cd72e0534fdc77f901a8d0abbc55f803a111f4594df75" exitCode=0 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.097544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" event={"ID":"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0","Type":"ContainerDied","Data":"d751b18d85435dbf5e8cd72e0534fdc77f901a8d0abbc55f803a111f4594df75"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.097607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" event={"ID":"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0","Type":"ContainerStarted","Data":"8084eddccbeae4f45fd852023bc09b11069ab9019a66ca7f1d766e2440d15743"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099495 4744 generic.go:334] "Generic (PLEG): container finished" podID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerID="951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" exitCode=0 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099671 4744 generic.go:334] "Generic (PLEG): container finished" podID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerID="7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" exitCode=0 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerDied","Data":"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099834 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerDied","Data":"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099905 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"e85ac7b0-b06b-4f8e-8330-088bb19d433f","Type":"ContainerDied","Data":"e0f37cf2aa3da8c391472b8c9774becb4ddeb00154167f2a365aa4dd5c0c52c0"} Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.099634 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.130773 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e85ac7b0-b06b-4f8e-8330-088bb19d433f-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.203531 4744 scope.go:117] "RemoveContainer" containerID="b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.220469 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.230223 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.230658 4744 scope.go:117] "RemoveContainer" containerID="8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" Dec 05 20:40:33 crc kubenswrapper[4744]: E1205 20:40:33.231230 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890\": container with ID starting with 8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890 not found: ID does not exist" containerID="8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.231276 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890"} err="failed to get container status \"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890\": rpc error: code = NotFound desc = could not find container \"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890\": container with ID starting with 8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.231354 4744 scope.go:117] "RemoveContainer" containerID="b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" Dec 05 20:40:33 crc kubenswrapper[4744]: E1205 20:40:33.231683 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94\": container with ID starting with b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94 not found: ID does not exist" containerID="b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.231726 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94"} err="failed to get container status \"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94\": rpc error: code = NotFound desc = could not find container \"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94\": container with ID starting with b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.231760 4744 scope.go:117] "RemoveContainer" containerID="8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 
20:40:33.233484 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890"} err="failed to get container status \"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890\": rpc error: code = NotFound desc = could not find container \"8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890\": container with ID starting with 8d6b87953265015d839e802741019edf3714ea664d9c0c3a10db022132c34890 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.233524 4744 scope.go:117] "RemoveContainer" containerID="b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.233922 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94"} err="failed to get container status \"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94\": rpc error: code = NotFound desc = could not find container \"b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94\": container with ID starting with b2bc9517dc3f86770ccca8ceea54fe05b252249b30b568410b1b6a018f8f2f94 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.233948 4744 scope.go:117] "RemoveContainer" containerID="951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.250332 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.255448 4744 scope.go:117] "RemoveContainer" containerID="7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.256427 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.276306 4744 scope.go:117] "RemoveContainer" containerID="951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" Dec 05 20:40:33 crc kubenswrapper[4744]: E1205 20:40:33.276751 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c\": container with ID starting with 951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c not found: ID does not exist" containerID="951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.276780 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c"} err="failed to get container status \"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c\": rpc error: code = NotFound desc = could not find container \"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c\": container with ID starting with 951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.276798 4744 scope.go:117] "RemoveContainer" containerID="7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" Dec 05 20:40:33 crc kubenswrapper[4744]: E1205 20:40:33.277301 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33\": container with ID starting with 7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33 not found: ID does not exist" containerID="7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.277323 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33"} err="failed to get container status \"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33\": rpc error: code = NotFound desc = could not find container \"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33\": container with ID starting with 7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.277343 4744 scope.go:117] "RemoveContainer" containerID="951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.277540 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c"} err="failed to get container status \"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c\": rpc error: code = NotFound desc = could not find container \"951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c\": container with ID starting with 951f17de3608510e4a5160a16bc961cd40c5cf5082cd658b7897f639f15e955c not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.277565 4744 scope.go:117] "RemoveContainer" containerID="7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.277831 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33"} err="failed to get container status \"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33\": rpc error: code = NotFound desc = could not find container \"7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33\": container with ID starting with 7e335b8d944e373351c4b928c8140f371488ca67c450def88d61e554e0a57d33 not found: ID does not exist" Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.751550 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.751939 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" containerName="watcher-decision-engine" containerID="cri-o://72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468" gracePeriod=30 Dec 05 20:40:33 crc kubenswrapper[4744]: I1205 20:40:33.945060 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.095164 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" path="/var/lib/kubelet/pods/9d9c66b1-f531-4967-86ae-287a1ce3a1c7/volumes" Dec 05 20:40:34 crc 
kubenswrapper[4744]: I1205 20:40:34.096256 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" path="/var/lib/kubelet/pods/e85ac7b0-b06b-4f8e-8330-088bb19d433f/volumes" Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.410967 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.411385 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-central-agent" containerID="cri-o://246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161" gracePeriod=30 Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.411929 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="proxy-httpd" containerID="cri-o://01c01d346ca2c807d91471dd8ce7d99511b3eda18615b3868f36f8866a5d34e6" gracePeriod=30 Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.412044 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="sg-core" containerID="cri-o://1467df7c76cb2807243c1d1ac62c74a3774d7572933e4f11c1fe28e63cf17bc0" gracePeriod=30 Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.412337 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-notification-agent" containerID="cri-o://5d73214a3642252484df6ce52dacebae3c72ebc72fd64b160d8b1c3e2ce84296" gracePeriod=30 Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.583401 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.658967 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts\") pod \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.659047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwv5l\" (UniqueName: \"kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l\") pod \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\" (UID: \"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0\") " Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.659878 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" (UID: "9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.664791 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l" (OuterVolumeSpecName: "kube-api-access-nwv5l") pod "9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" (UID: "9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0"). InnerVolumeSpecName "kube-api-access-nwv5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:34 crc kubenswrapper[4744]: W1205 20:40:34.723795 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3f2b01_6a2c_4cec_baf1_f3ed4b1104e0.slice/crio-d751b18d85435dbf5e8cd72e0534fdc77f901a8d0abbc55f803a111f4594df75.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3f2b01_6a2c_4cec_baf1_f3ed4b1104e0.slice/crio-d751b18d85435dbf5e8cd72e0534fdc77f901a8d0abbc55f803a111f4594df75.scope: no such file or directory Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.761197 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:34 crc kubenswrapper[4744]: I1205 20:40:34.761236 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwv5l\" (UniqueName: \"kubernetes.io/projected/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0-kube-api-access-nwv5l\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:34 crc kubenswrapper[4744]: E1205 20:40:34.898507 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1865381_500a_4338_b035_3b2ad20bacb7.slice/crio-246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161.scope\": RecentStats: unable to find data in memory cache]" Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.126573 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.133245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" event={"ID":"9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0","Type":"ContainerDied","Data":"8084eddccbeae4f45fd852023bc09b11069ab9019a66ca7f1d766e2440d15743"} Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.133321 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8084eddccbeae4f45fd852023bc09b11069ab9019a66ca7f1d766e2440d15743" Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.133401 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder2955-account-delete-jntzk" Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.138655 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1865381-500a-4338-b035-3b2ad20bacb7" containerID="01c01d346ca2c807d91471dd8ce7d99511b3eda18615b3868f36f8866a5d34e6" exitCode=0 Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.138702 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1865381-500a-4338-b035-3b2ad20bacb7" containerID="1467df7c76cb2807243c1d1ac62c74a3774d7572933e4f11c1fe28e63cf17bc0" exitCode=2 Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.138758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerDied","Data":"01c01d346ca2c807d91471dd8ce7d99511b3eda18615b3868f36f8866a5d34e6"} Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.138925 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerDied","Data":"1467df7c76cb2807243c1d1ac62c74a3774d7572933e4f11c1fe28e63cf17bc0"} Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.141952 4744 generic.go:334] "Generic (PLEG): container finished" podID="5018300a-b041-46cb-a989-0a490d4029c1" containerID="49c772b2b0a59e4ba233aa4092b0308331956ad080d21acd1aea3b9b8ba3c7bb" exitCode=137 Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.141997 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerDied","Data":"49c772b2b0a59e4ba233aa4092b0308331956ad080d21acd1aea3b9b8ba3c7bb"} Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.534311 4744 scope.go:117] "RemoveContainer" containerID="8efd5c115428d8639f692cebc8eae23d009bf945dc3f5a81f0247ca2b8ad33d7" Dec 05 20:40:35 crc kubenswrapper[4744]: I1205 20:40:35.554855 4744 scope.go:117] "RemoveContainer" containerID="53c6dc96f48fffaea4f1262bfbca8fe80ab2aa9fe1ace6003637c6ebc4a8b721" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.080278 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:40:36 crc kubenswrapper[4744]: E1205 20:40:36.080735 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.152908 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1865381-500a-4338-b035-3b2ad20bacb7" containerID="246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161" exitCode=0 Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.152951 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerDied","Data":"246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161"} Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.329690 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.350729 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387335 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387401 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387501 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387520 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387569 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387618 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrt95\" (UniqueName: \"kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.387701 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts\") pod \"5018300a-b041-46cb-a989-0a490d4029c1\" (UID: \"5018300a-b041-46cb-a989-0a490d4029c1\") " Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.388197 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.392637 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs" (OuterVolumeSpecName: "logs") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.393557 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95" (OuterVolumeSpecName: "kube-api-access-mrt95") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "kube-api-access-mrt95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.394078 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.394145 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts" (OuterVolumeSpecName: "scripts") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.431438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.434732 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data" (OuterVolumeSpecName: "config-data") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.451040 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "5018300a-b041-46cb-a989-0a490d4029c1" (UID: "5018300a-b041-46cb-a989-0a490d4029c1"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489644 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489676 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489685 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489693 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489703 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5018300a-b041-46cb-a989-0a490d4029c1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489712 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5018300a-b041-46cb-a989-0a490d4029c1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489720 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5018300a-b041-46cb-a989-0a490d4029c1-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.489728 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrt95\" (UniqueName: \"kubernetes.io/projected/5018300a-b041-46cb-a989-0a490d4029c1-kube-api-access-mrt95\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.757207 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-create-s6pk7"] Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.768682 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-create-s6pk7"] Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.779747 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-2955-account-create-update-td9j2"] Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.792966 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-2955-account-create-update-td9j2"] Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.801593 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder2955-account-delete-jntzk"] Dec 05 20:40:36 crc kubenswrapper[4744]: I1205 20:40:36.809386 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder2955-account-delete-jntzk"] Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.176004 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" 
event={"ID":"5018300a-b041-46cb-a989-0a490d4029c1","Type":"ContainerDied","Data":"f844fcb8833a1516638e872a5fec4d815eb0f573cf5517cb5a6204cda317d6ac"} Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.176059 4744 scope.go:117] "RemoveContainer" containerID="49c772b2b0a59e4ba233aa4092b0308331956ad080d21acd1aea3b9b8ba3c7bb" Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.176223 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.230387 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.231873 4744 scope.go:117] "RemoveContainer" containerID="9877224925f8cbe7b1efbe46d8e52936e3cdcf8750e2192c5e94395351f2d5c9" Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.238215 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 05 20:40:37 crc kubenswrapper[4744]: I1205 20:40:37.554951 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:38 crc kubenswrapper[4744]: I1205 20:40:38.094431 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5018300a-b041-46cb-a989-0a490d4029c1" path="/var/lib/kubelet/pods/5018300a-b041-46cb-a989-0a490d4029c1/volumes" Dec 05 20:40:38 crc kubenswrapper[4744]: I1205 20:40:38.095969 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505e1786-269f-46b4-a25d-5a8e41f9334b" path="/var/lib/kubelet/pods/505e1786-269f-46b4-a25d-5a8e41f9334b/volumes" Dec 05 20:40:38 crc kubenswrapper[4744]: I1205 20:40:38.096815 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" path="/var/lib/kubelet/pods/9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0/volumes" Dec 05 20:40:38 crc kubenswrapper[4744]: I1205 20:40:38.098031 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95dec5c-262e-4af1-a941-7a0d0dc36853" path="/var/lib/kubelet/pods/f95dec5c-262e-4af1-a941-7a0d0dc36853/volumes" Dec 05 20:40:38 crc kubenswrapper[4744]: I1205 20:40:38.755464 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.195910 4744 generic.go:334] "Generic (PLEG): container finished" podID="b1865381-500a-4338-b035-3b2ad20bacb7" containerID="5d73214a3642252484df6ce52dacebae3c72ebc72fd64b160d8b1c3e2ce84296" exitCode=0 Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.195988 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerDied","Data":"5d73214a3642252484df6ce52dacebae3c72ebc72fd64b160d8b1c3e2ce84296"} Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.196235 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b1865381-500a-4338-b035-3b2ad20bacb7","Type":"ContainerDied","Data":"3e5ff9046352cf3793beaa233f390123bde31f84a16bad561e50841bbccb5587"} Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.196254 4744 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3e5ff9046352cf3793beaa233f390123bde31f84a16bad561e50841bbccb5587" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.216535 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338258 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338308 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338411 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n94lw\" (UniqueName: \"kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338430 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338507 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338562 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338592 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.338612 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd\") pod \"b1865381-500a-4338-b035-3b2ad20bacb7\" (UID: \"b1865381-500a-4338-b035-3b2ad20bacb7\") " Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.339000 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.339180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.354212 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts" (OuterVolumeSpecName: "scripts") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.354272 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw" (OuterVolumeSpecName: "kube-api-access-n94lw") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "kube-api-access-n94lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.367774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.408529 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.415438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.440938 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n94lw\" (UniqueName: \"kubernetes.io/projected/b1865381-500a-4338-b035-3b2ad20bacb7-kube-api-access-n94lw\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441157 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441250 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441359 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441530 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441641 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1865381-500a-4338-b035-3b2ad20bacb7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441774 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.441936 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data" (OuterVolumeSpecName: "config-data") pod "b1865381-500a-4338-b035-3b2ad20bacb7" (UID: "b1865381-500a-4338-b035-3b2ad20bacb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.543442 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1865381-500a-4338-b035-3b2ad20bacb7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:39 crc kubenswrapper[4744]: I1205 20:40:39.993857 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.206869 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.229047 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.244063 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.262894 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263281 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="cinder-backup" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263322 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="cinder-backup" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263354 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" containerName="mariadb-account-delete" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263362 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" containerName="mariadb-account-delete" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263375 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263382 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263391 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="sg-core" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263397 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="sg-core" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263404 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263410 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263420 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api-log" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263426 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api-log" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263437 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263443 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263454 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-central-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263460 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-central-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263472 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-notification-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263479 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-notification-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263487 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="proxy-httpd" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263493 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="proxy-httpd" Dec 05 20:40:40 crc kubenswrapper[4744]: E1205 20:40:40.263504 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="cinder-scheduler" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263510 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="cinder-scheduler" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263667 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="cinder-scheduler" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263678 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3f2b01-6a2c-4cec-baf1-f3ed4b1104e0" containerName="mariadb-account-delete" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263687 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="proxy-httpd" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263701 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="sg-core" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263714 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9c66b1-f531-4967-86ae-287a1ce3a1c7" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263728 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-central-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263740 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5018300a-b041-46cb-a989-0a490d4029c1" containerName="cinder-api" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263747 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" containerName="ceilometer-notification-agent" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263757 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="probe" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263767 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85ac7b0-b06b-4f8e-8330-088bb19d433f" containerName="cinder-backup" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.263781 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5018300a-b041-46cb-a989-0a490d4029c1" 
containerName="cinder-api-log" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.269572 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.273122 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.273411 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.273562 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.278033 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457568 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwft\" (UniqueName: \"kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457616 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457653 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457669 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457710 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data\") pod \"ceilometer-0\" (UID: 
\"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.457741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559528 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559589 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559612 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwft\" (UniqueName: \"kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.559730 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.560696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.561179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.564382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.564736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.564853 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.565597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.566312 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.578673 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwft\" (UniqueName: \"kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft\") pod \"ceilometer-0\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:40 crc kubenswrapper[4744]: I1205 20:40:40.618817 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:41 crc kubenswrapper[4744]: I1205 20:40:41.062222 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:41 crc kubenswrapper[4744]: I1205 20:40:41.149231 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:41 crc kubenswrapper[4744]: I1205 20:40:41.218405 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerStarted","Data":"a54c02d7394a59a0164cd042c1601b98505d7843ac994781eb3b22515110560e"} Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.091319 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1865381-500a-4338-b035-3b2ad20bacb7" path="/var/lib/kubelet/pods/b1865381-500a-4338-b035-3b2ad20bacb7/volumes" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.241857 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerStarted","Data":"d313866aed818743e6b1dcb19321a72744910236a4e9f134b3bf8b2ccbbe0b84"} Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.372768 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_e1cfd875-844b-4246-b4fd-9286f7f4ca81/watcher-decision-engine/0.log" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.717924 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.902384 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9km96\" (UniqueName: \"kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.902767 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.902806 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.902869 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.902907 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: 
\"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.903042 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle\") pod \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\" (UID: \"e1cfd875-844b-4246-b4fd-9286f7f4ca81\") " Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.903323 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs" (OuterVolumeSpecName: "logs") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.903440 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1cfd875-844b-4246-b4fd-9286f7f4ca81-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.906838 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96" (OuterVolumeSpecName: "kube-api-access-9km96") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "kube-api-access-9km96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.926999 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.927571 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:42 crc kubenswrapper[4744]: I1205 20:40:42.950769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data" (OuterVolumeSpecName: "config-data") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.004844 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.004882 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9km96\" (UniqueName: \"kubernetes.io/projected/e1cfd875-844b-4246-b4fd-9286f7f4ca81-kube-api-access-9km96\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.004895 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.004905 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.113090 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e1cfd875-844b-4246-b4fd-9286f7f4ca81" (UID: "e1cfd875-844b-4246-b4fd-9286f7f4ca81"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.207497 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1cfd875-844b-4246-b4fd-9286f7f4ca81-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.265774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerStarted","Data":"fce23b8b5d7310add4cafdcad32bf1ef3f1d089c67eec3779049ace5f03cf9b1"} Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.265823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerStarted","Data":"ebcbc3dfa8120f968315f6fc6d5a54fef03796bec1c57fda67ae748e42f867fd"} Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.269335 4744 generic.go:334] "Generic (PLEG): container finished" podID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" containerID="72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468" exitCode=0 Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.269380 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e1cfd875-844b-4246-b4fd-9286f7f4ca81","Type":"ContainerDied","Data":"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468"} Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.269409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"e1cfd875-844b-4246-b4fd-9286f7f4ca81","Type":"ContainerDied","Data":"09219520111b2defa005eda395f59200d6d6c7f8c61783e6af876aa2d9e836b6"} Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.269500 4744 scope.go:117] "RemoveContainer" 
containerID="72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.269697 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.298421 4744 scope.go:117] "RemoveContainer" containerID="72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468" Dec 05 20:40:43 crc kubenswrapper[4744]: E1205 20:40:43.299065 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468\": container with ID starting with 72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468 not found: ID does not exist" containerID="72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.299354 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468"} err="failed to get container status \"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468\": rpc error: code = NotFound desc = could not find container \"72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468\": container with ID starting with 72e13954de7b622fa9b513e1852f69c6f62f1d2b7725b073bb14936cb988e468 not found: ID does not exist" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.327232 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.345593 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.357857 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:43 crc kubenswrapper[4744]: E1205 20:40:43.358260 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" containerName="watcher-decision-engine" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.358277 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" containerName="watcher-decision-engine" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.358458 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" containerName="watcher-decision-engine" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.359022 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.362084 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.369322 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512325 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512443 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512466 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9cw\" (UniqueName: \"kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.512651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.613938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.613998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ss9cw\" (UniqueName: \"kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.614049 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.614084 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.614130 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.614162 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.614937 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.628089 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.628162 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.628472 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 
20:40:43.628498 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.632849 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9cw\" (UniqueName: \"kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:43 crc kubenswrapper[4744]: I1205 20:40:43.680448 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:44 crc kubenswrapper[4744]: I1205 20:40:44.096193 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cfd875-844b-4246-b4fd-9286f7f4ca81" path="/var/lib/kubelet/pods/e1cfd875-844b-4246-b4fd-9286f7f4ca81/volumes" Dec 05 20:40:44 crc kubenswrapper[4744]: I1205 20:40:44.170310 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:44 crc kubenswrapper[4744]: W1205 20:40:44.172987 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9971832a_5acf_49d8_aba0_3a1425733875.slice/crio-3b965ca41f8554ee02d86346ca241cf8c43f598e39853a7ae9c564c16ba79d62 WatchSource:0}: Error finding container 3b965ca41f8554ee02d86346ca241cf8c43f598e39853a7ae9c564c16ba79d62: Status 404 returned error can't find the container with id 3b965ca41f8554ee02d86346ca241cf8c43f598e39853a7ae9c564c16ba79d62 Dec 05 20:40:44 crc kubenswrapper[4744]: I1205 20:40:44.278871 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9971832a-5acf-49d8-aba0-3a1425733875","Type":"ContainerStarted","Data":"3b965ca41f8554ee02d86346ca241cf8c43f598e39853a7ae9c564c16ba79d62"} Dec 05 20:40:44 crc kubenswrapper[4744]: I1205 20:40:44.281486 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerStarted","Data":"e7d5c6d41c77fe170228c097f4eca79210d72cb3a30957655e3b0f11cd2ccfd5"} Dec 05 20:40:44 crc kubenswrapper[4744]: I1205 20:40:44.309915 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.387572885 podStartE2EDuration="4.309892751s" podCreationTimestamp="2025-12-05 20:40:40 +0000 UTC" firstStartedPulling="2025-12-05 20:40:41.08811744 +0000 UTC m=+1811.317928828" lastFinishedPulling="2025-12-05 20:40:44.010437286 +0000 UTC m=+1814.240248694" observedRunningTime="2025-12-05 20:40:44.308545028 +0000 UTC m=+1814.538356396" watchObservedRunningTime="2025-12-05 20:40:44.309892751 +0000 UTC m=+1814.539704119" Dec 05 20:40:45 crc kubenswrapper[4744]: I1205 20:40:45.291324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9971832a-5acf-49d8-aba0-3a1425733875","Type":"ContainerStarted","Data":"dd051490cc0ec5f3e16172044c3a2d78da102178845268cb3b2e646f3370019f"} Dec 05 20:40:45 crc 
kubenswrapper[4744]: I1205 20:40:45.292436 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:45 crc kubenswrapper[4744]: I1205 20:40:45.325884 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.32587044 podStartE2EDuration="2.32587044s" podCreationTimestamp="2025-12-05 20:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:40:45.321431101 +0000 UTC m=+1815.551242469" watchObservedRunningTime="2025-12-05 20:40:45.32587044 +0000 UTC m=+1815.555681808" Dec 05 20:40:45 crc kubenswrapper[4744]: I1205 20:40:45.851062 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:47 crc kubenswrapper[4744]: I1205 20:40:47.043176 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:48 crc kubenswrapper[4744]: I1205 20:40:48.278942 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:49 crc kubenswrapper[4744]: I1205 20:40:49.080655 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:40:49 crc kubenswrapper[4744]: E1205 20:40:49.081032 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:40:50 crc kubenswrapper[4744]: I1205 20:40:50.046958 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:51 crc kubenswrapper[4744]: I1205 20:40:51.279281 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:52 crc kubenswrapper[4744]: I1205 20:40:52.524431 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:53 crc kubenswrapper[4744]: I1205 20:40:53.680757 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:53 crc kubenswrapper[4744]: I1205 20:40:53.718655 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:53 crc kubenswrapper[4744]: I1205 20:40:53.802164 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 
20:40:54 crc kubenswrapper[4744]: I1205 20:40:54.416449 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:54 crc kubenswrapper[4744]: I1205 20:40:54.459335 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.014628 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_9971832a-5acf-49d8-aba0-3a1425733875/watcher-decision-engine/0.log" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.148917 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.155883 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zjftx"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.206217 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.206529 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" containerName="watcher-applier" containerID="cri-o://dd76b0bb0a339f67a079b017ab055266bfd00727403284b81bfe4483240aed85" gracePeriod=30 Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.219115 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherab56-account-delete-95cgp"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.220101 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.225255 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherab56-account-delete-95cgp"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.252688 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.252732 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvcpw\" (UniqueName: \"kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.264470 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.292347 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.292555 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-kuttl-api-log" containerID="cri-o://a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7" gracePeriod=30 Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.292881 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-api" containerID="cri-o://b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2" gracePeriod=30 Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.353176 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.353224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvcpw\" (UniqueName: \"kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.354360 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.378439 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvcpw\" (UniqueName: \"kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw\") pod \"watcherab56-account-delete-95cgp\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.428457 4744 generic.go:334] "Generic (PLEG): container finished" podID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerID="a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7" exitCode=143 Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.428524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerDied","Data":"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7"} Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.429027 4744 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-mdbsh\" not found" Dec 05 20:40:55 crc kubenswrapper[4744]: I1205 20:40:55.546069 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:55 crc kubenswrapper[4744]: E1205 20:40:55.556301 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:55 crc kubenswrapper[4744]: E1205 20:40:55.556361 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data podName:9971832a-5acf-49d8-aba0-3a1425733875 nodeName:}" failed. No retries permitted until 2025-12-05 20:40:56.05634594 +0000 UTC m=+1826.286157308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "9971832a-5acf-49d8-aba0-3a1425733875") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.006890 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherab56-account-delete-95cgp"] Dec 05 20:40:56 crc kubenswrapper[4744]: W1205 20:40:56.020178 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1295f8df_82f8_40d4_b7af_c1c2b2f03016.slice/crio-09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f WatchSource:0}: Error finding container 09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f: Status 404 returned error can't find the container with id 09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f Dec 05 20:40:56 crc kubenswrapper[4744]: E1205 20:40:56.067029 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:56 crc kubenswrapper[4744]: E1205 20:40:56.067132 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data podName:9971832a-5acf-49d8-aba0-3a1425733875 nodeName:}" failed. 
No retries permitted until 2025-12-05 20:40:57.067107997 +0000 UTC m=+1827.296919375 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "9971832a-5acf-49d8-aba0-3a1425733875") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.101898 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96afaf9f-e24d-42ae-8d45-dcc85b9663c9" path="/var/lib/kubelet/pods/96afaf9f-e24d-42ae-8d45-dcc85b9663c9/volumes" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.227888 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.371911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372014 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372073 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372115 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372134 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372184 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5p79\" (UniqueName: \"kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79\") pod \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\" (UID: \"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18\") " Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.372663 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs" (OuterVolumeSpecName: "logs") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.385603 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79" (OuterVolumeSpecName: "kube-api-access-t5p79") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "kube-api-access-t5p79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.400773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.408451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.436001 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data" (OuterVolumeSpecName: "config-data") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.439051 4744 generic.go:334] "Generic (PLEG): container finished" podID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerID="b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2" exitCode=0 Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.439127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerDied","Data":"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2"} Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.439134 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.439160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5b3ca55d-f6d9-4b59-81b4-4295f3b20d18","Type":"ContainerDied","Data":"100420df580e9186e391e63ce87bed4ebf97acbeb04c654e16b08382981e3c54"} Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.439180 4744 scope.go:117] "RemoveContainer" containerID="b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.441193 4744 generic.go:334] "Generic (PLEG): container finished" podID="1295f8df-82f8-40d4-b7af-c1c2b2f03016" containerID="76f4447081f9e62d788f17c32db1b7f2c28204661cd660c045adbc35ffaaf9b0" exitCode=0 Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.441354 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="9971832a-5acf-49d8-aba0-3a1425733875" containerName="watcher-decision-engine" containerID="cri-o://dd051490cc0ec5f3e16172044c3a2d78da102178845268cb3b2e646f3370019f" gracePeriod=30 Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.441613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" event={"ID":"1295f8df-82f8-40d4-b7af-c1c2b2f03016","Type":"ContainerDied","Data":"76f4447081f9e62d788f17c32db1b7f2c28204661cd660c045adbc35ffaaf9b0"} Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.441633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" event={"ID":"1295f8df-82f8-40d4-b7af-c1c2b2f03016","Type":"ContainerStarted","Data":"09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f"} Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.453237 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" (UID: "5b3ca55d-f6d9-4b59-81b4-4295f3b20d18"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474718 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474749 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474762 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474773 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474785 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5p79\" (UniqueName: \"kubernetes.io/projected/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-kube-api-access-t5p79\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.474793 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.571668 4744 scope.go:117] "RemoveContainer" containerID="a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.593424 4744 scope.go:117] "RemoveContainer" containerID="b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2" Dec 05 20:40:56 crc kubenswrapper[4744]: E1205 20:40:56.593779 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2\": container with ID starting with b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2 not found: ID does not exist" containerID="b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.593809 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2"} err="failed to get container status \"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2\": rpc error: code = NotFound desc = could not find container \"b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2\": container with ID starting with b1a337111fad3b293940c5a211a7482d3f2534b5f27a314bbd0ce2fa0dde64d2 not found: ID does not exist" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.593833 4744 scope.go:117] "RemoveContainer" containerID="a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7" Dec 05 20:40:56 crc kubenswrapper[4744]: E1205 20:40:56.593988 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7\": container with ID starting with 
a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7 not found: ID does not exist" containerID="a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.594008 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7"} err="failed to get container status \"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7\": rpc error: code = NotFound desc = could not find container \"a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7\": container with ID starting with a0599f3dec2c07796ce0a3dfeb132cb40663c4bc85ec1b13c669b5d202b8a2d7 not found: ID does not exist" Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.771520 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:40:56 crc kubenswrapper[4744]: I1205 20:40:56.776868 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:40:57 crc kubenswrapper[4744]: E1205 20:40:57.090979 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:57 crc kubenswrapper[4744]: E1205 20:40:57.091076 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data podName:9971832a-5acf-49d8-aba0-3a1425733875 nodeName:}" failed. No retries permitted until 2025-12-05 20:40:59.09105487 +0000 UTC m=+1829.320866238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "9971832a-5acf-49d8-aba0-3a1425733875") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.458607 4744 generic.go:334] "Generic (PLEG): container finished" podID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" containerID="dd76b0bb0a339f67a079b017ab055266bfd00727403284b81bfe4483240aed85" exitCode=0 Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.458758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07c4b443-fda4-4eca-b1ad-4423b01e3aad","Type":"ContainerDied","Data":"dd76b0bb0a339f67a079b017ab055266bfd00727403284b81bfe4483240aed85"} Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.547117 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.547462 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-central-agent" containerID="cri-o://d313866aed818743e6b1dcb19321a72744910236a4e9f134b3bf8b2ccbbe0b84" gracePeriod=30 Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.547592 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="proxy-httpd" containerID="cri-o://e7d5c6d41c77fe170228c097f4eca79210d72cb3a30957655e3b0f11cd2ccfd5" gracePeriod=30 Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.547641 4744 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="sg-core" containerID="cri-o://fce23b8b5d7310add4cafdcad32bf1ef3f1d089c67eec3779049ace5f03cf9b1" gracePeriod=30 Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.547680 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-notification-agent" containerID="cri-o://ebcbc3dfa8120f968315f6fc6d5a54fef03796bec1c57fda67ae748e42f867fd" gracePeriod=30 Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.557672 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": EOF" Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.869022 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:40:57 crc kubenswrapper[4744]: I1205 20:40:57.896694 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007388 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs\") pod \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007431 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data\") pod \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007468 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts\") pod \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007491 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t7x6\" (UniqueName: \"kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6\") pod \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007583 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle\") pod \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007639 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvcpw\" (UniqueName: \"kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw\") pod \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\" (UID: \"1295f8df-82f8-40d4-b7af-c1c2b2f03016\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.007677 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls\") pod \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\" (UID: \"07c4b443-fda4-4eca-b1ad-4423b01e3aad\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.008218 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1295f8df-82f8-40d4-b7af-c1c2b2f03016" (UID: "1295f8df-82f8-40d4-b7af-c1c2b2f03016"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.008503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs" (OuterVolumeSpecName: "logs") pod "07c4b443-fda4-4eca-b1ad-4423b01e3aad" (UID: "07c4b443-fda4-4eca-b1ad-4423b01e3aad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.012328 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw" (OuterVolumeSpecName: "kube-api-access-vvcpw") pod "1295f8df-82f8-40d4-b7af-c1c2b2f03016" (UID: "1295f8df-82f8-40d4-b7af-c1c2b2f03016"). InnerVolumeSpecName "kube-api-access-vvcpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.012374 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6" (OuterVolumeSpecName: "kube-api-access-6t7x6") pod "07c4b443-fda4-4eca-b1ad-4423b01e3aad" (UID: "07c4b443-fda4-4eca-b1ad-4423b01e3aad"). InnerVolumeSpecName "kube-api-access-6t7x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.037262 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07c4b443-fda4-4eca-b1ad-4423b01e3aad" (UID: "07c4b443-fda4-4eca-b1ad-4423b01e3aad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.056103 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data" (OuterVolumeSpecName: "config-data") pod "07c4b443-fda4-4eca-b1ad-4423b01e3aad" (UID: "07c4b443-fda4-4eca-b1ad-4423b01e3aad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.073042 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "07c4b443-fda4-4eca-b1ad-4423b01e3aad" (UID: "07c4b443-fda4-4eca-b1ad-4423b01e3aad"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.090284 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" path="/var/lib/kubelet/pods/5b3ca55d-f6d9-4b59-81b4-4295f3b20d18/volumes" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109397 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvcpw\" (UniqueName: \"kubernetes.io/projected/1295f8df-82f8-40d4-b7af-c1c2b2f03016-kube-api-access-vvcpw\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109429 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109441 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c4b443-fda4-4eca-b1ad-4423b01e3aad-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109451 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109462 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1295f8df-82f8-40d4-b7af-c1c2b2f03016-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109474 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t7x6\" (UniqueName: \"kubernetes.io/projected/07c4b443-fda4-4eca-b1ad-4423b01e3aad-kube-api-access-6t7x6\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.109484 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c4b443-fda4-4eca-b1ad-4423b01e3aad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.480899 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.480888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07c4b443-fda4-4eca-b1ad-4423b01e3aad","Type":"ContainerDied","Data":"941ad5be3ec32832dd02da5ebc14716d95ec1e9e0d54960438271fce610fc57d"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.481859 4744 scope.go:117] "RemoveContainer" containerID="dd76b0bb0a339f67a079b017ab055266bfd00727403284b81bfe4483240aed85" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485108 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerID="e7d5c6d41c77fe170228c097f4eca79210d72cb3a30957655e3b0f11cd2ccfd5" exitCode=0 Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485152 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerID="fce23b8b5d7310add4cafdcad32bf1ef3f1d089c67eec3779049ace5f03cf9b1" exitCode=2 Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485162 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerID="ebcbc3dfa8120f968315f6fc6d5a54fef03796bec1c57fda67ae748e42f867fd" exitCode=0 Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485173 4744 generic.go:334] "Generic (PLEG): container finished" podID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerID="d313866aed818743e6b1dcb19321a72744910236a4e9f134b3bf8b2ccbbe0b84" exitCode=0 Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485184 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerDied","Data":"e7d5c6d41c77fe170228c097f4eca79210d72cb3a30957655e3b0f11cd2ccfd5"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485221 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerDied","Data":"fce23b8b5d7310add4cafdcad32bf1ef3f1d089c67eec3779049ace5f03cf9b1"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485234 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerDied","Data":"ebcbc3dfa8120f968315f6fc6d5a54fef03796bec1c57fda67ae748e42f867fd"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerDied","Data":"d313866aed818743e6b1dcb19321a72744910236a4e9f134b3bf8b2ccbbe0b84"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485257 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7a91ddff-ea50-4eed-a5c2-93f13b4abd56","Type":"ContainerDied","Data":"a54c02d7394a59a0164cd042c1601b98505d7843ac994781eb3b22515110560e"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.485269 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54c02d7394a59a0164cd042c1601b98505d7843ac994781eb3b22515110560e" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.486601 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" 
event={"ID":"1295f8df-82f8-40d4-b7af-c1c2b2f03016","Type":"ContainerDied","Data":"09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f"} Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.486623 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d746aaeaaf7c7cb46ce90dcac4d6f30cb36750c7dd36790635e421f3fbcd3f" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.486667 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherab56-account-delete-95cgp" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.535255 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.551806 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.564168 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718308 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718382 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718414 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718475 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718496 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718576 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718618 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwft\" (UniqueName: \"kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: 
\"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.718695 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd\") pod \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\" (UID: \"7a91ddff-ea50-4eed-a5c2-93f13b4abd56\") " Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.719361 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.719867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.720023 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.720039 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.723614 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft" (OuterVolumeSpecName: "kube-api-access-kmwft") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "kube-api-access-kmwft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.734579 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts" (OuterVolumeSpecName: "scripts") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.746207 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.777230 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.800521 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.822268 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.822336 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwft\" (UniqueName: \"kubernetes.io/projected/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-kube-api-access-kmwft\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.822354 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.822369 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.822384 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.837124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data" (OuterVolumeSpecName: "config-data") pod "7a91ddff-ea50-4eed-a5c2-93f13b4abd56" (UID: "7a91ddff-ea50-4eed-a5c2-93f13b4abd56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:40:58 crc kubenswrapper[4744]: I1205 20:40:58.923572 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a91ddff-ea50-4eed-a5c2-93f13b4abd56-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.126688 4744 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.127039 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data podName:9971832a-5acf-49d8-aba0-3a1425733875 nodeName:}" failed. No retries permitted until 2025-12-05 20:41:03.127022414 +0000 UTC m=+1833.356833792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "9971832a-5acf-49d8-aba0-3a1425733875") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.500120 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.553003 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.559065 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.585718 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586103 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="sg-core" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586122 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="sg-core" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586142 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-central-agent" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586149 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-central-agent" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586163 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-kuttl-api-log" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586171 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-kuttl-api-log" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586188 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1295f8df-82f8-40d4-b7af-c1c2b2f03016" containerName="mariadb-account-delete" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586195 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1295f8df-82f8-40d4-b7af-c1c2b2f03016" containerName="mariadb-account-delete" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586208 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" containerName="watcher-applier" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586215 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" containerName="watcher-applier" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586228 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-api" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586236 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-api" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586255 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="proxy-httpd" Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586263 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="proxy-httpd" Dec 05 20:40:59 crc kubenswrapper[4744]: E1205 20:40:59.586283 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-notification-agent" 
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586312 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-notification-agent"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586514 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" containerName="watcher-applier"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586530 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-notification-agent"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586542 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-kuttl-api-log"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586560 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="proxy-httpd"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586574 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3ca55d-f6d9-4b59-81b4-4295f3b20d18" containerName="watcher-api"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586586 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="ceilometer-central-agent"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586605 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" containerName="sg-core"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.586614 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1295f8df-82f8-40d4-b7af-c1c2b2f03016" containerName="mariadb-account-delete"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.588422 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.594003 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.594166 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.594237 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.599252 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737837 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdpz\" (UniqueName: \"kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737855 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737873 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.737980 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.839932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840075 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840140 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdpz\" (UniqueName: \"kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.840503 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.841114 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.841432 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.841744 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.846815 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.847379 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.848455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.855472 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.856060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.860870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdpz\" (UniqueName: \"kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz\") pod \"ceilometer-0\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:40:59 crc kubenswrapper[4744]: I1205 20:40:59.913509 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.092456 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c4b443-fda4-4eca-b1ad-4423b01e3aad" path="/var/lib/kubelet/pods/07c4b443-fda4-4eca-b1ad-4423b01e3aad/volumes"
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.093569 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a91ddff-ea50-4eed-a5c2-93f13b4abd56" path="/var/lib/kubelet/pods/7a91ddff-ea50-4eed-a5c2-93f13b4abd56/volumes"
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.238155 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sgc5p"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.251539 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sgc5p"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.265448 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.270982 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-ab56-account-create-update-8gh9t"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.277412 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherab56-account-delete-95cgp"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.282828 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherab56-account-delete-95cgp"]
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.427638 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:41:00 crc kubenswrapper[4744]: W1205 20:41:00.430229 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e63252d_7f8f_4399_ae89_40706313b337.slice/crio-50c4346679431578a31c178b5a8dacb9a2997da6638ad29785e9b8bf2e7cb961 WatchSource:0}: Error finding container 50c4346679431578a31c178b5a8dacb9a2997da6638ad29785e9b8bf2e7cb961: Status 404 returned error can't find the container with id 50c4346679431578a31c178b5a8dacb9a2997da6638ad29785e9b8bf2e7cb961
Dec 05 20:41:00 crc kubenswrapper[4744]: I1205 20:41:00.518036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerStarted","Data":"50c4346679431578a31c178b5a8dacb9a2997da6638ad29785e9b8bf2e7cb961"}
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.528153 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerStarted","Data":"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc"}
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.529639 4744 generic.go:334] "Generic (PLEG): container finished" podID="9971832a-5acf-49d8-aba0-3a1425733875" containerID="dd051490cc0ec5f3e16172044c3a2d78da102178845268cb3b2e646f3370019f" exitCode=0
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.529666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9971832a-5acf-49d8-aba0-3a1425733875","Type":"ContainerDied","Data":"dd051490cc0ec5f3e16172044c3a2d78da102178845268cb3b2e646f3370019f"}
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.700538 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883205 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883513 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs" (OuterVolumeSpecName: "logs") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883628 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883755 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883785 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9cw\" (UniqueName: \"kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.883930 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls\") pod \"9971832a-5acf-49d8-aba0-3a1425733875\" (UID: \"9971832a-5acf-49d8-aba0-3a1425733875\") "
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.884326 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9971832a-5acf-49d8-aba0-3a1425733875-logs\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.889480 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw" (OuterVolumeSpecName: "kube-api-access-ss9cw") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "kube-api-access-ss9cw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.913419 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.916072 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.953425 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data" (OuterVolumeSpecName: "config-data") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.981442 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9971832a-5acf-49d8-aba0-3a1425733875" (UID: "9971832a-5acf-49d8-aba0-3a1425733875"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.985541 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.985572 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.985588 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.985600 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9cw\" (UniqueName: \"kubernetes.io/projected/9971832a-5acf-49d8-aba0-3a1425733875-kube-api-access-ss9cw\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:01 crc kubenswrapper[4744]: I1205 20:41:01.985612 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9971832a-5acf-49d8-aba0-3a1425733875-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.088902 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0677bd03-1b77-4a98-8270-79816ee729bb" path="/var/lib/kubelet/pods/0677bd03-1b77-4a98-8270-79816ee729bb/volumes"
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.089404 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1295f8df-82f8-40d4-b7af-c1c2b2f03016" path="/var/lib/kubelet/pods/1295f8df-82f8-40d4-b7af-c1c2b2f03016/volumes"
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.089881 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83db5e02-d8cc-4e8b-88d4-f00b916e34fb" path="/var/lib/kubelet/pods/83db5e02-d8cc-4e8b-88d4-f00b916e34fb/volumes"
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.539002 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9971832a-5acf-49d8-aba0-3a1425733875","Type":"ContainerDied","Data":"3b965ca41f8554ee02d86346ca241cf8c43f598e39853a7ae9c564c16ba79d62"}
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.539370 4744 scope.go:117] "RemoveContainer" containerID="dd051490cc0ec5f3e16172044c3a2d78da102178845268cb3b2e646f3370019f"
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.539032 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.543108 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerStarted","Data":"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e"}
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.574409 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:41:02 crc kubenswrapper[4744]: I1205 20:41:02.585158 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.556404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerStarted","Data":"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5"}
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.687004 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-fp9n8"]
Dec 05 20:41:03 crc kubenswrapper[4744]: E1205 20:41:03.687555 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9971832a-5acf-49d8-aba0-3a1425733875" containerName="watcher-decision-engine"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.687567 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9971832a-5acf-49d8-aba0-3a1425733875" containerName="watcher-decision-engine"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.687728 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9971832a-5acf-49d8-aba0-3a1425733875" containerName="watcher-decision-engine"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.688253 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.710355 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"]
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.716108 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.718721 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.731977 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fp9n8"]
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.746558 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"]
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.819267 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.819347 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2cjn\" (UniqueName: \"kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.819405 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8w4c\" (UniqueName: \"kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.819467 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.920606 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8w4c\" (UniqueName: \"kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.920722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.920770 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.920788 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2cjn\" (UniqueName: \"kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.921901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.921981 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.939573 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8w4c\" (UniqueName: \"kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c\") pod \"watcher-db-create-fp9n8\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:03 crc kubenswrapper[4744]: I1205 20:41:03.943169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2cjn\" (UniqueName: \"kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn\") pod \"watcher-cf3b-account-create-update-44q7c\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.007747 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fp9n8"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.043454 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.081514 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:41:04 crc kubenswrapper[4744]: E1205 20:41:04.081914 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.098681 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9971832a-5acf-49d8-aba0-3a1425733875" path="/var/lib/kubelet/pods/9971832a-5acf-49d8-aba0-3a1425733875/volumes"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.489136 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fp9n8"]
Dec 05 20:41:04 crc kubenswrapper[4744]: W1205 20:41:04.501739 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ee60fc_4c24_4634_ac99_a46bb500f280.slice/crio-e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7 WatchSource:0}: Error finding container e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7: Status 404 returned error can't find the container with id e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.644163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fp9n8" event={"ID":"33ee60fc-4c24-4634-ac99-a46bb500f280","Type":"ContainerStarted","Data":"e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7"}
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.671465 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"]
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.682364 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerStarted","Data":"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95"}
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.683639 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:41:04 crc kubenswrapper[4744]: I1205 20:41:04.717131 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.625704573 podStartE2EDuration="5.71711612s" podCreationTimestamp="2025-12-05 20:40:59 +0000 UTC" firstStartedPulling="2025-12-05 20:41:00.433318815 +0000 UTC m=+1830.663130213" lastFinishedPulling="2025-12-05 20:41:03.524730382 +0000 UTC m=+1833.754541760" observedRunningTime="2025-12-05 20:41:04.715422909 +0000 UTC m=+1834.945234277" watchObservedRunningTime="2025-12-05 20:41:04.71711612 +0000 UTC m=+1834.946927488"
Dec 05 20:41:05 crc kubenswrapper[4744]: I1205 20:41:05.691454 4744 generic.go:334] "Generic (PLEG): container finished" podID="ed7ecb82-da56-4634-91db-8dbe745cb6f7"
containerID="c705812d17e73610015cb6aeab2284831a6fb05e3b6b382daae8950465eb9fb3" exitCode=0 Dec 05 20:41:05 crc kubenswrapper[4744]: I1205 20:41:05.691888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c" event={"ID":"ed7ecb82-da56-4634-91db-8dbe745cb6f7","Type":"ContainerDied","Data":"c705812d17e73610015cb6aeab2284831a6fb05e3b6b382daae8950465eb9fb3"} Dec 05 20:41:05 crc kubenswrapper[4744]: I1205 20:41:05.691933 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c" event={"ID":"ed7ecb82-da56-4634-91db-8dbe745cb6f7","Type":"ContainerStarted","Data":"a3c1a697360eb2db08f67072f7fd35a84a72c556002d063bbe84e015e38a7431"} Dec 05 20:41:05 crc kubenswrapper[4744]: I1205 20:41:05.695741 4744 generic.go:334] "Generic (PLEG): container finished" podID="33ee60fc-4c24-4634-ac99-a46bb500f280" containerID="5829861bcdbcca036d9431fd67071926377ab10428bbb9a0386e222665934602" exitCode=0 Dec 05 20:41:05 crc kubenswrapper[4744]: I1205 20:41:05.695915 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fp9n8" event={"ID":"33ee60fc-4c24-4634-ac99-a46bb500f280","Type":"ContainerDied","Data":"5829861bcdbcca036d9431fd67071926377ab10428bbb9a0386e222665934602"} Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.118318 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fp9n8" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.226329 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.295762 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts\") pod \"33ee60fc-4c24-4634-ac99-a46bb500f280\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.296198 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33ee60fc-4c24-4634-ac99-a46bb500f280" (UID: "33ee60fc-4c24-4634-ac99-a46bb500f280"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.296518 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts\") pod \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.296564 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2cjn\" (UniqueName: \"kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn\") pod \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\" (UID: \"ed7ecb82-da56-4634-91db-8dbe745cb6f7\") " Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.296597 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8w4c\" (UniqueName: \"kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c\") pod \"33ee60fc-4c24-4634-ac99-a46bb500f280\" (UID: \"33ee60fc-4c24-4634-ac99-a46bb500f280\") " Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.296943 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33ee60fc-4c24-4634-ac99-a46bb500f280-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.297860 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed7ecb82-da56-4634-91db-8dbe745cb6f7" (UID: "ed7ecb82-da56-4634-91db-8dbe745cb6f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.315667 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn" (OuterVolumeSpecName: "kube-api-access-c2cjn") pod "ed7ecb82-da56-4634-91db-8dbe745cb6f7" (UID: "ed7ecb82-da56-4634-91db-8dbe745cb6f7"). InnerVolumeSpecName "kube-api-access-c2cjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.315733 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c" (OuterVolumeSpecName: "kube-api-access-b8w4c") pod "33ee60fc-4c24-4634-ac99-a46bb500f280" (UID: "33ee60fc-4c24-4634-ac99-a46bb500f280"). InnerVolumeSpecName "kube-api-access-b8w4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.403521 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7ecb82-da56-4634-91db-8dbe745cb6f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.403553 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2cjn\" (UniqueName: \"kubernetes.io/projected/ed7ecb82-da56-4634-91db-8dbe745cb6f7-kube-api-access-c2cjn\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.403564 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8w4c\" (UniqueName: \"kubernetes.io/projected/33ee60fc-4c24-4634-ac99-a46bb500f280-kube-api-access-b8w4c\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.719654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c" event={"ID":"ed7ecb82-da56-4634-91db-8dbe745cb6f7","Type":"ContainerDied","Data":"a3c1a697360eb2db08f67072f7fd35a84a72c556002d063bbe84e015e38a7431"} Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.719693 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c1a697360eb2db08f67072f7fd35a84a72c556002d063bbe84e015e38a7431" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.719752 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.731367 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fp9n8" event={"ID":"33ee60fc-4c24-4634-ac99-a46bb500f280","Type":"ContainerDied","Data":"e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7"} Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.731460 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50e3d325f2cdd8659f70be68e85e3cb7571e9440ee819b7e9a353551cf7d1e7" Dec 05 20:41:07 crc kubenswrapper[4744]: I1205 20:41:07.731548 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fp9n8" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.022704 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-5npzz"] Dec 05 20:41:09 crc kubenswrapper[4744]: E1205 20:41:09.023064 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee60fc-4c24-4634-ac99-a46bb500f280" containerName="mariadb-database-create" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.023079 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee60fc-4c24-4634-ac99-a46bb500f280" containerName="mariadb-database-create" Dec 05 20:41:09 crc kubenswrapper[4744]: E1205 20:41:09.023110 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7ecb82-da56-4634-91db-8dbe745cb6f7" containerName="mariadb-account-create-update" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.023116 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7ecb82-da56-4634-91db-8dbe745cb6f7" containerName="mariadb-account-create-update" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.023249 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee60fc-4c24-4634-ac99-a46bb500f280" containerName="mariadb-database-create" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.023270 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7ecb82-da56-4634-91db-8dbe745cb6f7" containerName="mariadb-account-create-update" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.023831 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.027260 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.027758 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-jfzc2" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.033889 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-5npzz"] Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.128901 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597g2\" (UniqueName: \"kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.129456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.129538 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc 
kubenswrapper[4744]: I1205 20:41:09.129781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.231230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.231351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-597g2\" (UniqueName: \"kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.231397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.231484 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.240948 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.240999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.248406 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.258053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-597g2\" (UniqueName: \"kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2\") pod \"watcher-kuttl-db-sync-5npzz\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.384205 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:09 crc kubenswrapper[4744]: I1205 20:41:09.853941 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-5npzz"] Dec 05 20:41:10 crc kubenswrapper[4744]: I1205 20:41:10.766220 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" event={"ID":"66ab5b76-8301-4e9f-9aa7-b009652944e1","Type":"ContainerStarted","Data":"bf20b69d955132a45cc905c13bf5975c2a90b83b437706c3331f0e6940247a7e"} Dec 05 20:41:10 crc kubenswrapper[4744]: I1205 20:41:10.766661 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" event={"ID":"66ab5b76-8301-4e9f-9aa7-b009652944e1","Type":"ContainerStarted","Data":"a3af5115cdaca6e1cfd2106fc5d47191972950d3f95643a02b674130957f5aaf"} Dec 05 20:41:10 crc kubenswrapper[4744]: I1205 20:41:10.783552 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" podStartSLOduration=1.783537501 podStartE2EDuration="1.783537501s" podCreationTimestamp="2025-12-05 20:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:10.781715316 +0000 UTC m=+1841.011526674" watchObservedRunningTime="2025-12-05 20:41:10.783537501 +0000 UTC m=+1841.013348869" Dec 05 20:41:12 crc kubenswrapper[4744]: I1205 20:41:12.792138 4744 generic.go:334] "Generic (PLEG): container finished" podID="66ab5b76-8301-4e9f-9aa7-b009652944e1" containerID="bf20b69d955132a45cc905c13bf5975c2a90b83b437706c3331f0e6940247a7e" exitCode=0 Dec 05 20:41:12 crc kubenswrapper[4744]: I1205 20:41:12.792213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" event={"ID":"66ab5b76-8301-4e9f-9aa7-b009652944e1","Type":"ContainerDied","Data":"bf20b69d955132a45cc905c13bf5975c2a90b83b437706c3331f0e6940247a7e"} Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.153188 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.319373 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597g2\" (UniqueName: \"kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2\") pod \"66ab5b76-8301-4e9f-9aa7-b009652944e1\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.319462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data\") pod \"66ab5b76-8301-4e9f-9aa7-b009652944e1\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.319680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle\") pod \"66ab5b76-8301-4e9f-9aa7-b009652944e1\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.319753 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data\") pod \"66ab5b76-8301-4e9f-9aa7-b009652944e1\" (UID: \"66ab5b76-8301-4e9f-9aa7-b009652944e1\") " Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.339644 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "66ab5b76-8301-4e9f-9aa7-b009652944e1" (UID: "66ab5b76-8301-4e9f-9aa7-b009652944e1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.347004 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2" (OuterVolumeSpecName: "kube-api-access-597g2") pod "66ab5b76-8301-4e9f-9aa7-b009652944e1" (UID: "66ab5b76-8301-4e9f-9aa7-b009652944e1"). InnerVolumeSpecName "kube-api-access-597g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.352177 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66ab5b76-8301-4e9f-9aa7-b009652944e1" (UID: "66ab5b76-8301-4e9f-9aa7-b009652944e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.375723 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data" (OuterVolumeSpecName: "config-data") pod "66ab5b76-8301-4e9f-9aa7-b009652944e1" (UID: "66ab5b76-8301-4e9f-9aa7-b009652944e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.421851 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.421902 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-597g2\" (UniqueName: \"kubernetes.io/projected/66ab5b76-8301-4e9f-9aa7-b009652944e1-kube-api-access-597g2\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.421922 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.421938 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ab5b76-8301-4e9f-9aa7-b009652944e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.816123 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" event={"ID":"66ab5b76-8301-4e9f-9aa7-b009652944e1","Type":"ContainerDied","Data":"a3af5115cdaca6e1cfd2106fc5d47191972950d3f95643a02b674130957f5aaf"} Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.816180 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3af5115cdaca6e1cfd2106fc5d47191972950d3f95643a02b674130957f5aaf" Dec 05 20:41:14 crc kubenswrapper[4744]: I1205 20:41:14.816224 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-5npzz" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.081343 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:41:15 crc kubenswrapper[4744]: E1205 20:41:15.081783 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.229349 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: E1205 20:41:15.229739 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ab5b76-8301-4e9f-9aa7-b009652944e1" containerName="watcher-kuttl-db-sync" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.229754 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ab5b76-8301-4e9f-9aa7-b009652944e1" containerName="watcher-kuttl-db-sync" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.229941 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ab5b76-8301-4e9f-9aa7-b009652944e1" containerName="watcher-kuttl-db-sync" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.230601 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.235747 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-jfzc2" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.265527 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.275452 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.278017 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.279601 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.285115 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.315858 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.335637 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.335703 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.335724 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.335781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.335816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqppd\" (UniqueName: \"kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.370354 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.371738 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.378453 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.378499 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437099 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437135 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437151 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqppd\" (UniqueName: \"kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437217 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: 
I1205 20:41:15.437274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437305 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47s4x\" (UniqueName: \"kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.437340 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.438624 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.446020 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.446547 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.463125 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.481879 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqppd\" (UniqueName: \"kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd\") pod \"watcher-kuttl-applier-0\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538233 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538280 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjndt\" (UniqueName: \"kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538320 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47s4x\" (UniqueName: \"kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538437 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538517 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538567 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538698 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538731 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538755 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538801 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.538945 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.543991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.547048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.554991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.558570 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.562768 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47s4x\" (UniqueName: \"kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.591772 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.622382 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640746 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640794 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjndt\" (UniqueName: \"kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640860 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640885 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640904 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.640957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.641411 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.645622 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.646185 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.646454 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.648188 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.659954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjndt\" (UniqueName: \"kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt\") pod \"watcher-kuttl-api-0\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:15 crc kubenswrapper[4744]: I1205 20:41:15.684830 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.105036 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.117359 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:16 crc kubenswrapper[4744]: W1205 20:41:16.272745 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c510803_2f8d_422b_b970_ebcc1217fab1.slice/crio-7e783f659b05b3bfe550cf63aa0f623e115b22b91a97efb8d557c7f20d316740 WatchSource:0}: Error finding container 7e783f659b05b3bfe550cf63aa0f623e115b22b91a97efb8d557c7f20d316740: Status 404 returned error can't find the container with id 7e783f659b05b3bfe550cf63aa0f623e115b22b91a97efb8d557c7f20d316740 Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.274854 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.846816 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1183185b-c3f3-47ed-b168-408c077efcbb","Type":"ContainerStarted","Data":"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.847128 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1183185b-c3f3-47ed-b168-408c077efcbb","Type":"ContainerStarted","Data":"d81d3fe70cb87998fba8a3634360e4c5045d3b4735d12c548c5f1431aa8799b2"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.849645 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerStarted","Data":"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.849690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerStarted","Data":"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.849706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerStarted","Data":"7e783f659b05b3bfe550cf63aa0f623e115b22b91a97efb8d557c7f20d316740"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.849805 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.851651 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ee659ec0-fc2e-4720-bc81-416ad0498280","Type":"ContainerStarted","Data":"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.851695 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ee659ec0-fc2e-4720-bc81-416ad0498280","Type":"ContainerStarted","Data":"ffd116d8de1466771b7d739da917eb1a1a55eeacb8eb08756cd3083f8e46c1c1"} Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.870522 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.870508784 podStartE2EDuration="1.870508784s" podCreationTimestamp="2025-12-05 20:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:16.865258635 +0000 UTC m=+1847.095070003" watchObservedRunningTime="2025-12-05 20:41:16.870508784 +0000 UTC m=+1847.100320152" Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.894580 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.894556151 podStartE2EDuration="1.894556151s" podCreationTimestamp="2025-12-05 20:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:16.893203078 +0000 UTC m=+1847.123014446" watchObservedRunningTime="2025-12-05 20:41:16.894556151 +0000 UTC m=+1847.124367519" Dec 05 20:41:16 crc kubenswrapper[4744]: I1205 20:41:16.912501 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.912486859 podStartE2EDuration="1.912486859s" podCreationTimestamp="2025-12-05 20:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:16.906942884 +0000 UTC m=+1847.136754252" watchObservedRunningTime="2025-12-05 20:41:16.912486859 +0000 UTC m=+1847.142298227" Dec 05 20:41:19 crc kubenswrapper[4744]: I1205 20:41:19.077510 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:20 crc kubenswrapper[4744]: I1205 20:41:20.591935 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:20 crc kubenswrapper[4744]: I1205 20:41:20.684915 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.592112 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.623519 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.637315 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.674148 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.685577 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.692087 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.939863 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.942841 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.984180 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:25 crc kubenswrapper[4744]: I1205 20:41:25.993305 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.099745 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.100568 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-central-agent" containerID="cri-o://3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc" gracePeriod=30 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.100720 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" containerID="cri-o://0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95" gracePeriod=30 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.100770 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="sg-core" 
containerID="cri-o://ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5" gracePeriod=30 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.100817 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-notification-agent" containerID="cri-o://9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e" gracePeriod=30 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.116069 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969541 4744 generic.go:334] "Generic (PLEG): container finished" podID="5e63252d-7f8f-4399-ae89-40706313b337" containerID="0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95" exitCode=0 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969904 4744 generic.go:334] "Generic (PLEG): container finished" podID="5e63252d-7f8f-4399-ae89-40706313b337" containerID="ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5" exitCode=2 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969918 4744 generic.go:334] "Generic (PLEG): container finished" podID="5e63252d-7f8f-4399-ae89-40706313b337" containerID="3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc" exitCode=0 Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerDied","Data":"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95"} Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969962 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerDied","Data":"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5"} Dec 05 20:41:28 crc kubenswrapper[4744]: I1205 20:41:28.969982 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerDied","Data":"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc"} Dec 05 20:41:29 crc kubenswrapper[4744]: I1205 20:41:29.080489 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:41:29 crc kubenswrapper[4744]: E1205 20:41:29.080810 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:41:29 crc kubenswrapper[4744]: I1205 20:41:29.915150 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.209:3000/\": dial tcp 10.217.0.209:3000: connect: connection refused" Dec 05 20:41:32 crc kubenswrapper[4744]: I1205 
20:41:32.990857 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-5npzz"] Dec 05 20:41:32 crc kubenswrapper[4744]: I1205 20:41:32.997269 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-5npzz"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.034156 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchercf3b-account-delete-vbtk2"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.036160 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.052464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchercf3b-account-delete-vbtk2"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.122602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.122684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9tj\" (UniqueName: \"kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.137458 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.137681 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerName="watcher-applier" containerID="cri-o://be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" gracePeriod=30 Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.159003 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.159221 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-kuttl-api-log" containerID="cri-o://2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878" gracePeriod=30 Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.160145 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-api" containerID="cri-o://da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e" gracePeriod=30 Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.182559 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.182797 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
podUID="1183185b-c3f3-47ed-b168-408c077efcbb" containerName="watcher-decision-engine" containerID="cri-o://f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663" gracePeriod=30 Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.223534 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9tj\" (UniqueName: \"kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.223661 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.224441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.252032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9tj\" (UniqueName: \"kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj\") pod \"watchercf3b-account-delete-vbtk2\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.351144 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:33 crc kubenswrapper[4744]: I1205 20:41:33.953687 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchercf3b-account-delete-vbtk2"] Dec 05 20:41:33 crc kubenswrapper[4744]: W1205 20:41:33.954548 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29562ca3_c750_42df_bc4d_8e7958280481.slice/crio-68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202 WatchSource:0}: Error finding container 68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202: Status 404 returned error can't find the container with id 68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202 Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.015627 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.017149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" event={"ID":"29562ca3-c750-42df-bc4d-8e7958280481","Type":"ContainerStarted","Data":"68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202"} Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.020517 4744 generic.go:334] "Generic (PLEG): container finished" podID="5e63252d-7f8f-4399-ae89-40706313b337" containerID="9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e" exitCode=0 Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.020565 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.020603 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerDied","Data":"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e"} Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.020672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5e63252d-7f8f-4399-ae89-40706313b337","Type":"ContainerDied","Data":"50c4346679431578a31c178b5a8dacb9a2997da6638ad29785e9b8bf2e7cb961"} Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.020693 4744 scope.go:117] "RemoveContainer" containerID="0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.028776 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerID="2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878" exitCode=143 Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.028829 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerDied","Data":"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878"} Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.043659 4744 scope.go:117] "RemoveContainer" containerID="ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.048152 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-d357-account-create-update-g66fz"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.059692 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-c2ltm"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.067813 4744 scope.go:117] "RemoveContainer" containerID="9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.070468 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-d357-account-create-update-g66fz"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.075611 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-c2ltm"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.103714 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ab5b76-8301-4e9f-9aa7-b009652944e1" path="/var/lib/kubelet/pods/66ab5b76-8301-4e9f-9aa7-b009652944e1/volumes" Dec 05 20:41:34 
crc kubenswrapper[4744]: I1205 20:41:34.104219 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08" path="/var/lib/kubelet/pods/adce1d89-2b99-4f7c-ba03-0fcc1bb8ea08/volumes" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.104780 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5ce9df-fda1-446b-a224-9f4d1a93dc47" path="/var/lib/kubelet/pods/cd5ce9df-fda1-446b-a224-9f4d1a93dc47/volumes" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.117028 4744 scope.go:117] "RemoveContainer" containerID="3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145822 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145897 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145922 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145939 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdpz\" (UniqueName: \"kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145959 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.145997 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.146047 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.146096 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data\") pod \"5e63252d-7f8f-4399-ae89-40706313b337\" (UID: \"5e63252d-7f8f-4399-ae89-40706313b337\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.147923 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.149582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.154176 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts" (OuterVolumeSpecName: "scripts") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.155052 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz" (OuterVolumeSpecName: "kube-api-access-4mdpz") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "kube-api-access-4mdpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.198474 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.210792 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.212495 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252652 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252681 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e63252d-7f8f-4399-ae89-40706313b337-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252692 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252702 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252710 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdpz\" (UniqueName: \"kubernetes.io/projected/5e63252d-7f8f-4399-ae89-40706313b337-kube-api-access-4mdpz\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252718 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.252725 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.256572 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data" (OuterVolumeSpecName: "config-data") pod "5e63252d-7f8f-4399-ae89-40706313b337" (UID: "5e63252d-7f8f-4399-ae89-40706313b337"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.300072 4744 scope.go:117] "RemoveContainer" containerID="0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.300689 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95\": container with ID starting with 0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95 not found: ID does not exist" containerID="0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.300744 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95"} err="failed to get container status \"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95\": rpc error: code = NotFound desc = could not find container \"0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95\": container with ID starting with 0b84edf3841095c52db7070f12d64d4d9fc3a5f7b53b2635d69a14c81ec7bc95 not found: ID does not exist" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.300775 4744 scope.go:117] "RemoveContainer" containerID="ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.301201 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5\": container with ID starting with ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5 not found: ID does not exist" containerID="ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.301255 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5"} err="failed to get container status \"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5\": rpc error: code = NotFound desc = could not find container \"ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5\": container with ID starting with ca4c8dbede0240eb2d7ad4b4f643c6b8599b160c796125178bb936319707aef5 not found: ID does not exist" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.301285 4744 scope.go:117] "RemoveContainer" containerID="9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.301714 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e\": container with ID starting with 9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e not found: ID does not exist" containerID="9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.301770 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e"} err="failed to get container status \"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e\": rpc error: code = NotFound desc = could not 
find container \"9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e\": container with ID starting with 9399e321ccfe38a509f81c4a74b2cf2c3fe5d8186d60c863a8b702409056fd0e not found: ID does not exist" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.301817 4744 scope.go:117] "RemoveContainer" containerID="3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.302233 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc\": container with ID starting with 3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc not found: ID does not exist" containerID="3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.302259 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc"} err="failed to get container status \"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc\": rpc error: code = NotFound desc = could not find container \"3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc\": container with ID starting with 3f7008c507d8d6388d4fb2a7c8a28bddac099704ad7f4a376285a36efe9d6ebc not found: ID does not exist" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.354488 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e63252d-7f8f-4399-ae89-40706313b337-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.394469 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.401081 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.425588 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.425970 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="sg-core" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.425994 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="sg-core" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.426021 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426030 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.426045 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-central-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426053 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-central-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: E1205 20:41:34.426079 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e63252d-7f8f-4399-ae89-40706313b337" 
containerName="ceilometer-notification-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426087 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-notification-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426568 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="proxy-httpd" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426602 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-central-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426615 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="ceilometer-notification-agent" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.426625 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e63252d-7f8f-4399-ae89-40706313b337" containerName="sg-core" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.428217 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.430259 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.430824 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.430833 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.445549 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.576728 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577071 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577127 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 
20:41:34.577252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnn4l\" (UniqueName: \"kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577388 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.577520 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.678968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679017 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnn4l\" (UniqueName: \"kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679102 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679144 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679177 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.679211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.680349 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.680345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.683692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.685199 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.689514 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.689635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.690371 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.704786 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-nnn4l\" (UniqueName: \"kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l\") pod \"ceilometer-0\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.795756 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.832698 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983401 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjndt\" (UniqueName: \"kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983456 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983553 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983603 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.983670 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls\") pod \"7c510803-2f8d-422b-b970-ebcc1217fab1\" (UID: \"7c510803-2f8d-422b-b970-ebcc1217fab1\") " Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.985848 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs" (OuterVolumeSpecName: "logs") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:34 crc kubenswrapper[4744]: I1205 20:41:34.991469 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt" (OuterVolumeSpecName: "kube-api-access-gjndt") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). 
InnerVolumeSpecName "kube-api-access-gjndt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.008293 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.034057 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.054817 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data" (OuterVolumeSpecName: "config-data") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.055027 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerID="da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e" exitCode=0 Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.055090 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerDied","Data":"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e"} Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.055121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"7c510803-2f8d-422b-b970-ebcc1217fab1","Type":"ContainerDied","Data":"7e783f659b05b3bfe550cf63aa0f623e115b22b91a97efb8d557c7f20d316740"} Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.055141 4744 scope.go:117] "RemoveContainer" containerID="da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.055257 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.059150 4744 generic.go:334] "Generic (PLEG): container finished" podID="29562ca3-c750-42df-bc4d-8e7958280481" containerID="6e3a2a4acc6ff94bd082aaa2663dd3c2ae1b06ef6291b53db18ca789c6d7b93d" exitCode=0 Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.059184 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" event={"ID":"29562ca3-c750-42df-bc4d-8e7958280481","Type":"ContainerDied","Data":"6e3a2a4acc6ff94bd082aaa2663dd3c2ae1b06ef6291b53db18ca789c6d7b93d"} Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.068607 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "7c510803-2f8d-422b-b970-ebcc1217fab1" (UID: "7c510803-2f8d-422b-b970-ebcc1217fab1"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.083266 4744 scope.go:117] "RemoveContainer" containerID="2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085927 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085947 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085957 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjndt\" (UniqueName: \"kubernetes.io/projected/7c510803-2f8d-422b-b970-ebcc1217fab1-kube-api-access-gjndt\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085966 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085975 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c510803-2f8d-422b-b970-ebcc1217fab1-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.085983 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c510803-2f8d-422b-b970-ebcc1217fab1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.107058 4744 scope.go:117] "RemoveContainer" containerID="da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e" Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.107581 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e\": container with ID starting with da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e not found: ID does not exist" containerID="da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e" Dec 05 
20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.107613 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e"} err="failed to get container status \"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e\": rpc error: code = NotFound desc = could not find container \"da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e\": container with ID starting with da071c12ccb3cf7a2b46363a5a49e8aa50a0e17eef45a9f9425b47876ffcc66e not found: ID does not exist" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.107631 4744 scope.go:117] "RemoveContainer" containerID="2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878" Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.107996 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878\": container with ID starting with 2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878 not found: ID does not exist" containerID="2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.108023 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878"} err="failed to get container status \"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878\": rpc error: code = NotFound desc = could not find container \"2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878\": container with ID starting with 2b6c8e2595a2802f59dc32a59a5a227c97ff669d7b03cf3f1befc6f81bef5878 not found: ID does not exist" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.311488 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:35 crc kubenswrapper[4744]: W1205 20:41:35.321635 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9eae43_bb48_4499_9958_665fe8fa9b02.slice/crio-8d0ca335a410c1b208c2c1ff51a2cd0d1fb2de041a36019337db82b4c32b76d8 WatchSource:0}: Error finding container 8d0ca335a410c1b208c2c1ff51a2cd0d1fb2de041a36019337db82b4c32b76d8: Status 404 returned error can't find the container with id 8d0ca335a410c1b208c2c1ff51a2cd0d1fb2de041a36019337db82b4c32b76d8 Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.386713 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.394458 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.596378 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.601503 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.603546 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:41:35 crc kubenswrapper[4744]: E1205 20:41:35.603627 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerName="watcher-applier" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.743294 4744 scope.go:117] "RemoveContainer" containerID="eadbfc5ee2005ba399e4d0e0307660149410178262f88158328a83cae16e4612" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.780722 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.857227 4744 scope.go:117] "RemoveContainer" containerID="459e3d5c0d306564a3666de2a1828256b44e906766ae6fe1639db84c71df8b1b" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.891656 4744 scope.go:117] "RemoveContainer" containerID="9cf7325abf7b894ef7e81fd88034d3caacae387c0810a43d85cc18f0a1849d2b" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.914106 4744 scope.go:117] "RemoveContainer" containerID="0bf1083548f5d39ef55e8658a898e3970cc658bd2c5be4fb2dbaddf47052a6b7" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.931649 4744 scope.go:117] "RemoveContainer" containerID="907cd745d35bc32e4192938fd9d110d9799cac27617b634baafcb9e67964cddf" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.947449 4744 scope.go:117] "RemoveContainer" containerID="c68a8721639ff6780a5e47c3baff7e7748c1e124c1430194d0fd371c4946756a" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.963849 4744 scope.go:117] "RemoveContainer" containerID="7bd559647e30caa57c089caf9a11afcfb54f1242416956d673e2b1b849c775b1" Dec 05 20:41:35 crc kubenswrapper[4744]: I1205 20:41:35.985668 4744 scope.go:117] "RemoveContainer" containerID="d0875b5d39e0e847f374a2df32cb7bd08b1d2e9efb4d5afac7c9c80a330ccb83" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.010366 4744 scope.go:117] "RemoveContainer" containerID="ad5bb2e5c902f8f334770fc8c38d4b6378ad921244c8f313bbc0747f18bd6fd5" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.031985 4744 scope.go:117] "RemoveContainer" containerID="8cd93ac303e7672e0dd178a58a7e51b745bd68ebdfe10909dc99676f764a3aab" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.049216 4744 scope.go:117] "RemoveContainer" containerID="c56c20d58f662ebca38e439b26d62ea3ef945a21c219b1b59b56db3ec172b337" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.078281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerStarted","Data":"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa"} Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.078343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerStarted","Data":"8d0ca335a410c1b208c2c1ff51a2cd0d1fb2de041a36019337db82b4c32b76d8"} Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.101045 4744 scope.go:117] "RemoveContainer" containerID="ca1b5f30478bd337aeaa00fb47fc8b774f1c74d136d2283d8e7ca9d1135f54ff" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.104086 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e63252d-7f8f-4399-ae89-40706313b337" path="/var/lib/kubelet/pods/5e63252d-7f8f-4399-ae89-40706313b337/volumes" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.105137 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" path="/var/lib/kubelet/pods/7c510803-2f8d-422b-b970-ebcc1217fab1/volumes" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.153464 4744 scope.go:117] "RemoveContainer" containerID="9f51ded012562761ae7cc1414729222e1030c498d1ac491ef6f97d94f83e559c" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.197721 4744 scope.go:117] "RemoveContainer" containerID="89c0f0f6d1336044c620f54079aa4b81b5eaa806807f8ceb58c77418c5afa7d5" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.213715 4744 scope.go:117] "RemoveContainer" containerID="1ee8afc9cfb0dcd9fbc94f6649cd5d76c15e557d931027df25758c40c4e731cb" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.520449 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.612864 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts\") pod \"29562ca3-c750-42df-bc4d-8e7958280481\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.612926 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh9tj\" (UniqueName: \"kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj\") pod \"29562ca3-c750-42df-bc4d-8e7958280481\" (UID: \"29562ca3-c750-42df-bc4d-8e7958280481\") " Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.613387 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29562ca3-c750-42df-bc4d-8e7958280481" (UID: "29562ca3-c750-42df-bc4d-8e7958280481"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.623627 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj" (OuterVolumeSpecName: "kube-api-access-sh9tj") pod "29562ca3-c750-42df-bc4d-8e7958280481" (UID: "29562ca3-c750-42df-bc4d-8e7958280481"). InnerVolumeSpecName "kube-api-access-sh9tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.719470 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29562ca3-c750-42df-bc4d-8e7958280481-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:36 crc kubenswrapper[4744]: I1205 20:41:36.719516 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh9tj\" (UniqueName: \"kubernetes.io/projected/29562ca3-c750-42df-bc4d-8e7958280481-kube-api-access-sh9tj\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.092639 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.093652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchercf3b-account-delete-vbtk2" event={"ID":"29562ca3-c750-42df-bc4d-8e7958280481","Type":"ContainerDied","Data":"68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202"} Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.093692 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68575be92b7bfbb8cd583e9449b29ace5e8c3e2f59792173917e487583cb4202" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.095560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerStarted","Data":"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735"} Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.729096 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.838788 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data\") pod \"ee659ec0-fc2e-4720-bc81-416ad0498280\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.838911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls\") pod \"ee659ec0-fc2e-4720-bc81-416ad0498280\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.838952 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle\") pod \"ee659ec0-fc2e-4720-bc81-416ad0498280\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.839028 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqppd\" (UniqueName: \"kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd\") pod \"ee659ec0-fc2e-4720-bc81-416ad0498280\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.839052 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs\") pod \"ee659ec0-fc2e-4720-bc81-416ad0498280\" (UID: \"ee659ec0-fc2e-4720-bc81-416ad0498280\") " Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.839603 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs" (OuterVolumeSpecName: "logs") pod "ee659ec0-fc2e-4720-bc81-416ad0498280" (UID: "ee659ec0-fc2e-4720-bc81-416ad0498280"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.853449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd" (OuterVolumeSpecName: "kube-api-access-nqppd") pod "ee659ec0-fc2e-4720-bc81-416ad0498280" (UID: "ee659ec0-fc2e-4720-bc81-416ad0498280"). InnerVolumeSpecName "kube-api-access-nqppd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.869406 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee659ec0-fc2e-4720-bc81-416ad0498280" (UID: "ee659ec0-fc2e-4720-bc81-416ad0498280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.883191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data" (OuterVolumeSpecName: "config-data") pod "ee659ec0-fc2e-4720-bc81-416ad0498280" (UID: "ee659ec0-fc2e-4720-bc81-416ad0498280"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.914985 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ee659ec0-fc2e-4720-bc81-416ad0498280" (UID: "ee659ec0-fc2e-4720-bc81-416ad0498280"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.941368 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.941402 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.941412 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqppd\" (UniqueName: \"kubernetes.io/projected/ee659ec0-fc2e-4720-bc81-416ad0498280-kube-api-access-nqppd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.941423 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee659ec0-fc2e-4720-bc81-416ad0498280-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:37 crc kubenswrapper[4744]: I1205 20:41:37.941431 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee659ec0-fc2e-4720-bc81-416ad0498280-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.078483 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fp9n8"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.091473 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fp9n8"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.119749 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchercf3b-account-delete-vbtk2"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.123888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerStarted","Data":"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e"} Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.127036 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.130071 4744 generic.go:334] "Generic (PLEG): container finished" podID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" exitCode=0 Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.130122 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ee659ec0-fc2e-4720-bc81-416ad0498280","Type":"ContainerDied","Data":"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254"} Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.130162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ee659ec0-fc2e-4720-bc81-416ad0498280","Type":"ContainerDied","Data":"ffd116d8de1466771b7d739da917eb1a1a55eeacb8eb08756cd3083f8e46c1c1"} Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.130184 4744 scope.go:117] "RemoveContainer" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.130394 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.137190 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchercf3b-account-delete-vbtk2"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.145826 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-cf3b-account-create-update-44q7c"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194161 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-4mwtz"] Dec 05 20:41:38 crc kubenswrapper[4744]: E1205 20:41:38.194545 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-kuttl-api-log" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194561 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-kuttl-api-log" Dec 05 20:41:38 crc kubenswrapper[4744]: E1205 20:41:38.194579 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerName="watcher-applier" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194587 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerName="watcher-applier" Dec 05 20:41:38 crc kubenswrapper[4744]: E1205 20:41:38.194594 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-api" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194600 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-api" Dec 05 20:41:38 crc kubenswrapper[4744]: E1205 20:41:38.194633 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29562ca3-c750-42df-bc4d-8e7958280481" containerName="mariadb-account-delete" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194639 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="29562ca3-c750-42df-bc4d-8e7958280481" containerName="mariadb-account-delete" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194789 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-kuttl-api-log" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194804 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" containerName="watcher-applier" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194819 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c510803-2f8d-422b-b970-ebcc1217fab1" containerName="watcher-api" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.194833 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="29562ca3-c750-42df-bc4d-8e7958280481" containerName="mariadb-account-delete" Dec 05 
20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.195382 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.214362 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-4mwtz"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.226952 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.238759 4744 scope.go:117] "RemoveContainer" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" Dec 05 20:41:38 crc kubenswrapper[4744]: E1205 20:41:38.239148 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254\": container with ID starting with be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254 not found: ID does not exist" containerID="be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.239170 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254"} err="failed to get container status \"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254\": rpc error: code = NotFound desc = could not find container \"be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254\": container with ID starting with be5a0d37e2794a1fbedc0d9c68fa6a90e2f868de6bcaead89705ca1814ec8254 not found: ID does not exist" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.263397 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.284175 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-vmm54"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.285226 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.293157 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.304508 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-vmm54"] Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.366160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.366219 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdql7\" (UniqueName: \"kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.467465 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.467527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.467563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdql7\" (UniqueName: \"kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.467626 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmqk\" (UniqueName: \"kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.468269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.483048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdql7\" 
(UniqueName: \"kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7\") pod \"watcher-db-create-4mwtz\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.530286 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.532620 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.568602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmqk\" (UniqueName: \"kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.568664 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.569241 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.596717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmqk\" (UniqueName: \"kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk\") pod \"watcher-test-account-create-update-vmm54\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.616346 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669441 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669767 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669844 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669897 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.669946 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47s4x\" (UniqueName: \"kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x\") pod \"1183185b-c3f3-47ed-b168-408c077efcbb\" (UID: \"1183185b-c3f3-47ed-b168-408c077efcbb\") " Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.670166 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs" (OuterVolumeSpecName: "logs") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.670637 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1183185b-c3f3-47ed-b168-408c077efcbb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.675182 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x" (OuterVolumeSpecName: "kube-api-access-47s4x") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "kube-api-access-47s4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.721961 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.725746 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.751452 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.769680 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data" (OuterVolumeSpecName: "config-data") pod "1183185b-c3f3-47ed-b168-408c077efcbb" (UID: "1183185b-c3f3-47ed-b168-408c077efcbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.772139 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.772163 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.772173 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47s4x\" (UniqueName: \"kubernetes.io/projected/1183185b-c3f3-47ed-b168-408c077efcbb-kube-api-access-47s4x\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.772183 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.772192 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1183185b-c3f3-47ed-b168-408c077efcbb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:38 crc kubenswrapper[4744]: I1205 20:41:38.976242 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-4mwtz"] Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.150890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-4mwtz" event={"ID":"215b84b6-8bdb-4102-9ef1-80ef9f6a538e","Type":"ContainerStarted","Data":"22fe04d8e9951a38dee857d78bc63b762f2d6b2ee95f244dbf84c452bce5bcf6"} Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.152814 4744 generic.go:334] "Generic (PLEG): container finished" podID="1183185b-c3f3-47ed-b168-408c077efcbb" containerID="f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663" exitCode=0 Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.152874 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1183185b-c3f3-47ed-b168-408c077efcbb","Type":"ContainerDied","Data":"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663"} Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.152903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1183185b-c3f3-47ed-b168-408c077efcbb","Type":"ContainerDied","Data":"d81d3fe70cb87998fba8a3634360e4c5045d3b4735d12c548c5f1431aa8799b2"} Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.152920 4744 scope.go:117] "RemoveContainer" containerID="f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.153005 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.164664 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-vmm54"] Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.182372 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerStarted","Data":"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2"} Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.182553 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-central-agent" containerID="cri-o://8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa" gracePeriod=30 Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.182789 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.182986 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="proxy-httpd" containerID="cri-o://a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2" gracePeriod=30 Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.183041 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="sg-core" containerID="cri-o://1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e" gracePeriod=30 Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.183072 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-notification-agent" containerID="cri-o://aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735" gracePeriod=30 Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.215875 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.7338566499999999 podStartE2EDuration="5.215856848s" podCreationTimestamp="2025-12-05 20:41:34 +0000 UTC" firstStartedPulling="2025-12-05 20:41:35.32408597 +0000 UTC m=+1865.553897338" lastFinishedPulling="2025-12-05 20:41:38.806086168 +0000 UTC m=+1869.035897536" observedRunningTime="2025-12-05 20:41:39.208479798 +0000 UTC m=+1869.438291186" watchObservedRunningTime="2025-12-05 20:41:39.215856848 +0000 UTC m=+1869.445668216" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.246537 4744 scope.go:117] "RemoveContainer" containerID="f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663" Dec 05 20:41:39 crc kubenswrapper[4744]: E1205 20:41:39.247611 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663\": container with ID starting with f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663 not found: ID does not exist" containerID="f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.247646 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663"} err="failed to get container status \"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663\": rpc error: code = NotFound desc = could not find container \"f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663\": container with ID starting with f1ab19e4cf29a766ed8a86313dd78b4c4202f0e9b136c9707e3744294e001663 not found: ID does not exist" Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.300485 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:39 crc kubenswrapper[4744]: I1205 20:41:39.308396 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.089910 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1183185b-c3f3-47ed-b168-408c077efcbb" path="/var/lib/kubelet/pods/1183185b-c3f3-47ed-b168-408c077efcbb/volumes" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.090952 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29562ca3-c750-42df-bc4d-8e7958280481" path="/var/lib/kubelet/pods/29562ca3-c750-42df-bc4d-8e7958280481/volumes" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.091485 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ee60fc-4c24-4634-ac99-a46bb500f280" path="/var/lib/kubelet/pods/33ee60fc-4c24-4634-ac99-a46bb500f280/volumes" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.092480 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7ecb82-da56-4634-91db-8dbe745cb6f7" path="/var/lib/kubelet/pods/ed7ecb82-da56-4634-91db-8dbe745cb6f7/volumes" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.093044 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee659ec0-fc2e-4720-bc81-416ad0498280" path="/var/lib/kubelet/pods/ee659ec0-fc2e-4720-bc81-416ad0498280/volumes" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.200977 4744 generic.go:334] "Generic (PLEG): container finished" podID="215b84b6-8bdb-4102-9ef1-80ef9f6a538e" containerID="128094aea8fa20b96b19fe855c1d1f1c226683188f571a036f4c0baf84ecee45" exitCode=0 Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.201042 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-4mwtz" event={"ID":"215b84b6-8bdb-4102-9ef1-80ef9f6a538e","Type":"ContainerDied","Data":"128094aea8fa20b96b19fe855c1d1f1c226683188f571a036f4c0baf84ecee45"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205231 4744 generic.go:334] "Generic (PLEG): container finished" podID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerID="a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2" exitCode=0 Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205324 4744 generic.go:334] "Generic (PLEG): container finished" podID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerID="1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e" exitCode=2 Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205340 4744 generic.go:334] "Generic (PLEG): container finished" podID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerID="aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735" exitCode=0 Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205398 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerDied","Data":"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205430 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerDied","Data":"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.205449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerDied","Data":"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.206671 4744 generic.go:334] "Generic (PLEG): container finished" podID="f26fa528-ef83-4870-ba3d-ae08e92b47b9" containerID="ca968dd8c18f20e765a8e2b95db30864932ab1e30c4b8227429b63a9ad992989" exitCode=0 Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.206688 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" event={"ID":"f26fa528-ef83-4870-ba3d-ae08e92b47b9","Type":"ContainerDied","Data":"ca968dd8c18f20e765a8e2b95db30864932ab1e30c4b8227429b63a9ad992989"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.206700 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" event={"ID":"f26fa528-ef83-4870-ba3d-ae08e92b47b9","Type":"ContainerStarted","Data":"7fe907d4f99e5b24faa72af294bb5562e04ddddb80d86a5818ff0f29c28b0efd"} Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.642914 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803169 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803222 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803341 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803411 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnn4l\" (UniqueName: \"kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803432 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.803559 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs\") pod \"4e9eae43-bb48-4499-9958-665fe8fa9b02\" (UID: \"4e9eae43-bb48-4499-9958-665fe8fa9b02\") " Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.804544 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.804850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.810554 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l" (OuterVolumeSpecName: "kube-api-access-nnn4l") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "kube-api-access-nnn4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.812396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts" (OuterVolumeSpecName: "scripts") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.838432 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.852857 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.868116 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.884935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data" (OuterVolumeSpecName: "config-data") pod "4e9eae43-bb48-4499-9958-665fe8fa9b02" (UID: "4e9eae43-bb48-4499-9958-665fe8fa9b02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905067 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905118 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905138 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnn4l\" (UniqueName: \"kubernetes.io/projected/4e9eae43-bb48-4499-9958-665fe8fa9b02-kube-api-access-nnn4l\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905157 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905174 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e9eae43-bb48-4499-9958-665fe8fa9b02-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905192 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905214 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:40 crc kubenswrapper[4744]: I1205 20:41:40.905240 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e9eae43-bb48-4499-9958-665fe8fa9b02-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.080725 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.081170 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.217720 4744 generic.go:334] "Generic (PLEG): container finished" podID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerID="8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa" exitCode=0 Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.217777 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.217860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerDied","Data":"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa"} Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.217941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4e9eae43-bb48-4499-9958-665fe8fa9b02","Type":"ContainerDied","Data":"8d0ca335a410c1b208c2c1ff51a2cd0d1fb2de041a36019337db82b4c32b76d8"} Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.217986 4744 scope.go:117] "RemoveContainer" containerID="a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.286062 4744 scope.go:117] "RemoveContainer" containerID="1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.292567 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.301188 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313100 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.313413 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="sg-core" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313430 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="sg-core" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.313446 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="proxy-httpd" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313452 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="proxy-httpd" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.313462 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-notification-agent" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313468 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-notification-agent" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.313478 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1183185b-c3f3-47ed-b168-408c077efcbb" containerName="watcher-decision-engine" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313483 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1183185b-c3f3-47ed-b168-408c077efcbb" containerName="watcher-decision-engine" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.313495 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-central-agent" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313500 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-central-agent" Dec 05 
20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313646 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="proxy-httpd" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313655 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-central-agent" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313667 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1183185b-c3f3-47ed-b168-408c077efcbb" containerName="watcher-decision-engine" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313677 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="sg-core" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.313689 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" containerName="ceilometer-notification-agent" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.314946 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.315025 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.326088 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.326098 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.326327 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.357046 4744 scope.go:117] "RemoveContainer" containerID="aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.378662 4744 scope.go:117] "RemoveContainer" containerID="8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417281 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417476 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc557\" (UniqueName: \"kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417659 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417683 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417850 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.417899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.424173 4744 scope.go:117] "RemoveContainer" containerID="a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.424952 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2\": container with ID starting with a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2 not found: ID does not exist" containerID="a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.425041 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2"} err="failed to get container status \"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2\": rpc error: code = NotFound desc = could not find container \"a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2\": container with ID starting with a18212a6c4a2c0adcda6121abd3ea94808ab16c28cb66b05a432543b5016bfa2 not found: ID does not exist" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.425079 4744 scope.go:117] "RemoveContainer" containerID="1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.425522 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e\": container with ID starting with 
1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e not found: ID does not exist" containerID="1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.425565 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e"} err="failed to get container status \"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e\": rpc error: code = NotFound desc = could not find container \"1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e\": container with ID starting with 1768282a465e7e7b04dafc44a3f0452d1f0a684143b225c0356f6fc00066d54e not found: ID does not exist" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.425601 4744 scope.go:117] "RemoveContainer" containerID="aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.426334 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735\": container with ID starting with aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735 not found: ID does not exist" containerID="aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.426361 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735"} err="failed to get container status \"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735\": rpc error: code = NotFound desc = could not find container \"aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735\": container with ID starting with aef905eae0ce7310a2fd774f80917cc74b426ecb473b1a84682a6e7d09c76735 not found: ID does not exist" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.426384 4744 scope.go:117] "RemoveContainer" containerID="8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa" Dec 05 20:41:41 crc kubenswrapper[4744]: E1205 20:41:41.427166 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa\": container with ID starting with 8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa not found: ID does not exist" containerID="8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.427202 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa"} err="failed to get container status \"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa\": rpc error: code = NotFound desc = could not find container \"8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa\": container with ID starting with 8ff058fa71bd088663e597344b1be176f7194603fe6791175938a38e739d9ffa not found: ID does not exist" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519206 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519366 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc557\" (UniqueName: \"kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519418 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.519928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.520200 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.526759 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.530273 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.530272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.538052 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.540361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc557\" (UniqueName: \"kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.546272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.605679 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.658445 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.722852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts\") pod \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.723237 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfmqk\" (UniqueName: \"kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk\") pod \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\" (UID: \"f26fa528-ef83-4870-ba3d-ae08e92b47b9\") " Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.724641 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f26fa528-ef83-4870-ba3d-ae08e92b47b9" (UID: "f26fa528-ef83-4870-ba3d-ae08e92b47b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.726268 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk" (OuterVolumeSpecName: "kube-api-access-nfmqk") pod "f26fa528-ef83-4870-ba3d-ae08e92b47b9" (UID: "f26fa528-ef83-4870-ba3d-ae08e92b47b9"). InnerVolumeSpecName "kube-api-access-nfmqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.737264 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.825149 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f26fa528-ef83-4870-ba3d-ae08e92b47b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.825183 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfmqk\" (UniqueName: \"kubernetes.io/projected/f26fa528-ef83-4870-ba3d-ae08e92b47b9-kube-api-access-nfmqk\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.926185 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts\") pod \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.926334 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdql7\" (UniqueName: \"kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7\") pod \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\" (UID: \"215b84b6-8bdb-4102-9ef1-80ef9f6a538e\") " Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.926837 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "215b84b6-8bdb-4102-9ef1-80ef9f6a538e" (UID: "215b84b6-8bdb-4102-9ef1-80ef9f6a538e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:41:41 crc kubenswrapper[4744]: I1205 20:41:41.929938 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7" (OuterVolumeSpecName: "kube-api-access-qdql7") pod "215b84b6-8bdb-4102-9ef1-80ef9f6a538e" (UID: "215b84b6-8bdb-4102-9ef1-80ef9f6a538e"). InnerVolumeSpecName "kube-api-access-qdql7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.028630 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdql7\" (UniqueName: \"kubernetes.io/projected/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-kube-api-access-qdql7\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.028663 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215b84b6-8bdb-4102-9ef1-80ef9f6a538e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.096879 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9eae43-bb48-4499-9958-665fe8fa9b02" path="/var/lib/kubelet/pods/4e9eae43-bb48-4499-9958-665fe8fa9b02/volumes" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.133940 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.228005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" event={"ID":"f26fa528-ef83-4870-ba3d-ae08e92b47b9","Type":"ContainerDied","Data":"7fe907d4f99e5b24faa72af294bb5562e04ddddb80d86a5818ff0f29c28b0efd"} Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.228049 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe907d4f99e5b24faa72af294bb5562e04ddddb80d86a5818ff0f29c28b0efd" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.228102 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-vmm54" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.231623 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerStarted","Data":"887b2b69d046197e4e5ce8b9b8988c0e32ffbb5ef5d43ff9cbbedc0d7e54d756"} Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.233644 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-4mwtz" event={"ID":"215b84b6-8bdb-4102-9ef1-80ef9f6a538e","Type":"ContainerDied","Data":"22fe04d8e9951a38dee857d78bc63b762f2d6b2ee95f244dbf84c452bce5bcf6"} Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.233680 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22fe04d8e9951a38dee857d78bc63b762f2d6b2ee95f244dbf84c452bce5bcf6" Dec 05 20:41:42 crc kubenswrapper[4744]: I1205 20:41:42.233703 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-4mwtz" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.244101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerStarted","Data":"5c4a9ce4561c70839437800e1391ba03e9bb294b9ddafbcae3e14b3e9a851efa"} Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.600808 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-69895"] Dec 05 20:41:43 crc kubenswrapper[4744]: E1205 20:41:43.601117 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215b84b6-8bdb-4102-9ef1-80ef9f6a538e" containerName="mariadb-database-create" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.601133 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="215b84b6-8bdb-4102-9ef1-80ef9f6a538e" containerName="mariadb-database-create" Dec 05 20:41:43 crc kubenswrapper[4744]: E1205 20:41:43.601158 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26fa528-ef83-4870-ba3d-ae08e92b47b9" containerName="mariadb-account-create-update" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.601164 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26fa528-ef83-4870-ba3d-ae08e92b47b9" containerName="mariadb-account-create-update" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.601312 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="215b84b6-8bdb-4102-9ef1-80ef9f6a538e" containerName="mariadb-database-create" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.601335 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26fa528-ef83-4870-ba3d-ae08e92b47b9" containerName="mariadb-account-create-update" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.601837 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.606864 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-xw6mx" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.608671 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.618047 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-69895"] Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.755279 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.755470 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rsd\" (UniqueName: \"kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.755602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.755665 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.857244 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.857370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4rsd\" (UniqueName: \"kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.857406 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc 
kubenswrapper[4744]: I1205 20:41:43.857435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.862635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.865804 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.869129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.877876 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4rsd\" (UniqueName: \"kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd\") pod \"watcher-kuttl-db-sync-69895\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:43 crc kubenswrapper[4744]: I1205 20:41:43.914860 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:44 crc kubenswrapper[4744]: I1205 20:41:44.253477 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerStarted","Data":"ccb528b61e4927fba7180d07b978f1677a204a6ba5a04d615d32e62a838ba605"} Dec 05 20:41:44 crc kubenswrapper[4744]: I1205 20:41:44.254101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerStarted","Data":"f4abe7b92f2a7814cc9038c28d88b8fb685aa7f210eb3a21ae0ce8321aa6af83"} Dec 05 20:41:44 crc kubenswrapper[4744]: W1205 20:41:44.425725 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c098edd_eac1_4078_831b_efc5572e94ce.slice/crio-ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb WatchSource:0}: Error finding container ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb: Status 404 returned error can't find the container with id ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb Dec 05 20:41:44 crc kubenswrapper[4744]: I1205 20:41:44.430182 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-69895"] Dec 05 20:41:45 crc kubenswrapper[4744]: I1205 20:41:45.264963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" event={"ID":"5c098edd-eac1-4078-831b-efc5572e94ce","Type":"ContainerStarted","Data":"7bc978c9bd198c9a2b39b98805aad2f918174954b5fbbd71fff48bb3d4491596"} Dec 05 20:41:45 crc kubenswrapper[4744]: I1205 20:41:45.265261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" event={"ID":"5c098edd-eac1-4078-831b-efc5572e94ce","Type":"ContainerStarted","Data":"ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb"} Dec 05 20:41:45 crc kubenswrapper[4744]: I1205 20:41:45.281648 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" podStartSLOduration=2.2816289530000002 podStartE2EDuration="2.281628953s" podCreationTimestamp="2025-12-05 20:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:45.278943748 +0000 UTC m=+1875.508755136" watchObservedRunningTime="2025-12-05 20:41:45.281628953 +0000 UTC m=+1875.511440341" Dec 05 20:41:46 crc kubenswrapper[4744]: I1205 20:41:46.275076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerStarted","Data":"72cc857daaf97271fbb3685c30567841c9fdee329faafb63de0393161163158a"} Dec 05 20:41:46 crc kubenswrapper[4744]: I1205 20:41:46.300188 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.256489642 podStartE2EDuration="5.300171484s" podCreationTimestamp="2025-12-05 20:41:41 +0000 UTC" firstStartedPulling="2025-12-05 20:41:42.133718305 +0000 UTC m=+1872.363529693" lastFinishedPulling="2025-12-05 20:41:45.177400157 +0000 UTC m=+1875.407211535" observedRunningTime="2025-12-05 20:41:46.293438109 +0000 UTC m=+1876.523249477" watchObservedRunningTime="2025-12-05 
20:41:46.300171484 +0000 UTC m=+1876.529982852" Dec 05 20:41:47 crc kubenswrapper[4744]: I1205 20:41:47.283433 4744 generic.go:334] "Generic (PLEG): container finished" podID="5c098edd-eac1-4078-831b-efc5572e94ce" containerID="7bc978c9bd198c9a2b39b98805aad2f918174954b5fbbd71fff48bb3d4491596" exitCode=0 Dec 05 20:41:47 crc kubenswrapper[4744]: I1205 20:41:47.283507 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" event={"ID":"5c098edd-eac1-4078-831b-efc5572e94ce","Type":"ContainerDied","Data":"7bc978c9bd198c9a2b39b98805aad2f918174954b5fbbd71fff48bb3d4491596"} Dec 05 20:41:47 crc kubenswrapper[4744]: I1205 20:41:47.283686 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.714741 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.761964 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data\") pod \"5c098edd-eac1-4078-831b-efc5572e94ce\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.762097 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle\") pod \"5c098edd-eac1-4078-831b-efc5572e94ce\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.762127 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data\") pod \"5c098edd-eac1-4078-831b-efc5572e94ce\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.762168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4rsd\" (UniqueName: \"kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd\") pod \"5c098edd-eac1-4078-831b-efc5572e94ce\" (UID: \"5c098edd-eac1-4078-831b-efc5572e94ce\") " Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.768005 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd" (OuterVolumeSpecName: "kube-api-access-f4rsd") pod "5c098edd-eac1-4078-831b-efc5572e94ce" (UID: "5c098edd-eac1-4078-831b-efc5572e94ce"). InnerVolumeSpecName "kube-api-access-f4rsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.767774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c098edd-eac1-4078-831b-efc5572e94ce" (UID: "5c098edd-eac1-4078-831b-efc5572e94ce"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.790763 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c098edd-eac1-4078-831b-efc5572e94ce" (UID: "5c098edd-eac1-4078-831b-efc5572e94ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.821850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data" (OuterVolumeSpecName: "config-data") pod "5c098edd-eac1-4078-831b-efc5572e94ce" (UID: "5c098edd-eac1-4078-831b-efc5572e94ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.863767 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4rsd\" (UniqueName: \"kubernetes.io/projected/5c098edd-eac1-4078-831b-efc5572e94ce-kube-api-access-f4rsd\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.863812 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.863824 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:48 crc kubenswrapper[4744]: I1205 20:41:48.863832 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c098edd-eac1-4078-831b-efc5572e94ce-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.314559 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" event={"ID":"5c098edd-eac1-4078-831b-efc5572e94ce","Type":"ContainerDied","Data":"ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb"} Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.314599 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ada3e157be6c17dedd730cd27fcc16496e97d4c906bd1c69b9dcab8903ed86fb" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.314626 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-69895" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.611265 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: E1205 20:41:49.611737 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c098edd-eac1-4078-831b-efc5572e94ce" containerName="watcher-kuttl-db-sync" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.611762 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c098edd-eac1-4078-831b-efc5572e94ce" containerName="watcher-kuttl-db-sync" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.611980 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c098edd-eac1-4078-831b-efc5572e94ce" containerName="watcher-kuttl-db-sync" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.613134 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.618388 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-xw6mx" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.618412 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.626707 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.627710 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.631242 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.644480 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.645812 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.664432 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.681538 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.690902 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.729165 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.730327 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.732524 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.738343 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780152 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780194 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780213 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxlm\" (UniqueName: \"kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780837 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780914 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nb25\" (UniqueName: \"kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.780957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781058 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781099 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq545\" (UniqueName: \"kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781251 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781384 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781465 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781484 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781583 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.781652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.882656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.882718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq545\" (UniqueName: \"kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.882746 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.882890 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883016 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883055 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:41:49 
crc kubenswrapper[4744]: I1205 20:41:49.883145 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883191 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883201 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883280 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883352 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883414 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxlm\" (UniqueName: \"kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883475 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.883744 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nb25\" (UniqueName: \"kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884403 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884464 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.884526 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ff69\" (UniqueName: \"kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.888473 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.888550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.888751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.888794 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.889252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.889684 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.890103 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.891339 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.891351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.892543 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.892894 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.892982 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.897909 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.909728 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq545\" (UniqueName: \"kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545\") pod \"watcher-kuttl-api-1\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.913350 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxlm\" (UniqueName: \"kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.915597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nb25\" (UniqueName: \"kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25\") pod \"watcher-kuttl-api-0\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.930772 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.954604 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.965343 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.987167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.987283 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.987346 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.987378 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ff69\" (UniqueName: \"kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.987405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.988045 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.996356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.996398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:49 crc kubenswrapper[4744]: I1205 20:41:49.996865 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.005604 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ff69\" (UniqueName: \"kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69\") pod \"watcher-kuttl-applier-0\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.045689 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.430394 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.462557 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 05 20:41:50 crc kubenswrapper[4744]: W1205 20:41:50.469382 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea37095_c0eb_4f83_b06a_561b77d1846a.slice/crio-b08c8b49479f28beb1cea60e48e4bad1526b3f045d8f935dc40400cd95a09ba7 WatchSource:0}: Error finding container b08c8b49479f28beb1cea60e48e4bad1526b3f045d8f935dc40400cd95a09ba7: Status 404 returned error can't find the container with id b08c8b49479f28beb1cea60e48e4bad1526b3f045d8f935dc40400cd95a09ba7
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.698156 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Dec 05 20:41:50 crc kubenswrapper[4744]: I1205 20:41:50.711646 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 05 20:41:50 crc kubenswrapper[4744]: W1205 20:41:50.712543 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ff514e_253b_4ec3_b370_35e2aa9f6103.slice/crio-e22016c983aa1d35b9e9e3dc15cbc7e9417030b776d9f667fecdbb1196c3137f WatchSource:0}: Error finding container e22016c983aa1d35b9e9e3dc15cbc7e9417030b776d9f667fecdbb1196c3137f: Status 404 returned error can't find the container with id e22016c983aa1d35b9e9e3dc15cbc7e9417030b776d9f667fecdbb1196c3137f
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.339364 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"74a97696-88ce-4bcc-9ae0-7bf972ecc08b","Type":"ContainerStarted","Data":"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.339417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"74a97696-88ce-4bcc-9ae0-7bf972ecc08b","Type":"ContainerStarted","Data":"eae3d4c33b08a362cf30f49ef24c8b6ce6f4ba7a510042e59c3831429d426a1c"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.342763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerStarted","Data":"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.342804 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerStarted","Data":"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.342819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerStarted","Data":"b08c8b49479f28beb1cea60e48e4bad1526b3f045d8f935dc40400cd95a09ba7"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.343790 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.345991 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerStarted","Data":"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.346027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerStarted","Data":"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.346036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerStarted","Data":"cf1b90d2252000bcce90e62cb2097cec60aa27dc7f1075c0cc96820aaadbf867"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.346361 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.348705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"44ff514e-253b-4ec3-b370-35e2aa9f6103","Type":"ContainerStarted","Data":"3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.348761 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"44ff514e-253b-4ec3-b370-35e2aa9f6103","Type":"ContainerStarted","Data":"e22016c983aa1d35b9e9e3dc15cbc7e9417030b776d9f667fecdbb1196c3137f"}
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.368809 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.36877346 podStartE2EDuration="2.36877346s" podCreationTimestamp="2025-12-05 20:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:51.361635106 +0000 UTC m=+1881.591446484" watchObservedRunningTime="2025-12-05 20:41:51.36877346 +0000 UTC m=+1881.598584828"
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.392625 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.392591872 podStartE2EDuration="2.392591872s" podCreationTimestamp="2025-12-05 20:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:51.38228147 +0000 UTC m=+1881.612092838" watchObservedRunningTime="2025-12-05 20:41:51.392591872 +0000 UTC m=+1881.622403250"
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.402505 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.402490404 podStartE2EDuration="2.402490404s" podCreationTimestamp="2025-12-05 20:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:51.397886571 +0000 UTC m=+1881.627697949" watchObservedRunningTime="2025-12-05 20:41:51.402490404 +0000 UTC m=+1881.632301772"
Dec 05 20:41:51 crc kubenswrapper[4744]: I1205 20:41:51.418059 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.418040913 podStartE2EDuration="2.418040913s" podCreationTimestamp="2025-12-05 20:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:41:51.410857398 +0000 UTC m=+1881.640668766" watchObservedRunningTime="2025-12-05 20:41:51.418040913 +0000 UTC m=+1881.647852281"
Dec 05 20:41:52 crc kubenswrapper[4744]: I1205 20:41:52.081486 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:41:52 crc kubenswrapper[4744]: E1205 20:41:52.082099 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:41:53 crc kubenswrapper[4744]: I1205 20:41:53.325794 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:53 crc kubenswrapper[4744]: I1205 20:41:53.386647 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:54 crc kubenswrapper[4744]: I1205 20:41:54.931972 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:54 crc kubenswrapper[4744]: I1205 20:41:54.966474 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:55 crc kubenswrapper[4744]: I1205 20:41:55.047405 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.931319 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.936610 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.955831 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.966627 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.974157 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:41:59 crc kubenswrapper[4744]: I1205 20:41:59.984363 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.047380 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.098645 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.136921 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"]
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.143900 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.155998 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"]
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.156811 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-scripts"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.161561 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.179823 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jsh\" (UniqueName: \"kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.179911 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.180040 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.180106 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.281391 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.281440 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.281473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.281544 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jsh\" (UniqueName: \"kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.286889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.289765 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.290185 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.311337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jsh\" (UniqueName: \"kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh\") pod \"watcher-kuttl-db-purge-29416122-j7s6k\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.453811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.460577 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.464063 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.484960 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.514157 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 05 20:42:00 crc kubenswrapper[4744]: I1205 20:42:00.536064 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 05 20:42:01 crc kubenswrapper[4744]: I1205 20:42:01.041463 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"]
Dec 05 20:42:01 crc kubenswrapper[4744]: I1205 20:42:01.463747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k" event={"ID":"721f12fa-46f1-4f0d-a57b-7f8463a83c77","Type":"ContainerStarted","Data":"29e9e868e49fe0e6de7e312d86270b80a88b7c3bc533f798955d0ede1d0f5a82"}
Dec 05 20:42:01 crc kubenswrapper[4744]: I1205 20:42:01.463783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k" event={"ID":"721f12fa-46f1-4f0d-a57b-7f8463a83c77","Type":"ContainerStarted","Data":"ac19cabfedd23fd64e50fdb8c01ad63ba20af6b577a0cd7058e761853118e7cd"}
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.624183 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k" podStartSLOduration=2.624167418 podStartE2EDuration="2.624167418s" podCreationTimestamp="2025-12-05 20:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:42:01.495713742 +0000 UTC m=+1891.725525110" watchObservedRunningTime="2025-12-05 20:42:02.624167418 +0000 UTC m=+1892.853978786"
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.630978 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.631513 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="proxy-httpd" containerID="cri-o://72cc857daaf97271fbb3685c30567841c9fdee329faafb63de0393161163158a" gracePeriod=30
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.631581 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-notification-agent" containerID="cri-o://f4abe7b92f2a7814cc9038c28d88b8fb685aa7f210eb3a21ae0ce8321aa6af83" gracePeriod=30
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.631606 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="sg-core" containerID="cri-o://ccb528b61e4927fba7180d07b978f1677a204a6ba5a04d615d32e62a838ba605" gracePeriod=30
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.633153 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-central-agent" containerID="cri-o://5c4a9ce4561c70839437800e1391ba03e9bb294b9ddafbcae3e14b3e9a851efa" gracePeriod=30
Dec 05 20:42:02 crc kubenswrapper[4744]: I1205 20:42:02.653548 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480611 4744 generic.go:334] "Generic (PLEG): container finished" podID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerID="72cc857daaf97271fbb3685c30567841c9fdee329faafb63de0393161163158a" exitCode=0
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480851 4744 generic.go:334] "Generic (PLEG): container finished" podID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerID="ccb528b61e4927fba7180d07b978f1677a204a6ba5a04d615d32e62a838ba605" exitCode=2
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480860 4744 generic.go:334] "Generic (PLEG): container finished" podID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerID="5c4a9ce4561c70839437800e1391ba03e9bb294b9ddafbcae3e14b3e9a851efa" exitCode=0
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerDied","Data":"72cc857daaf97271fbb3685c30567841c9fdee329faafb63de0393161163158a"}
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerDied","Data":"ccb528b61e4927fba7180d07b978f1677a204a6ba5a04d615d32e62a838ba605"}
Dec 05 20:42:03 crc kubenswrapper[4744]: I1205 20:42:03.480910 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerDied","Data":"5c4a9ce4561c70839437800e1391ba03e9bb294b9ddafbcae3e14b3e9a851efa"}
Dec 05 20:42:04 crc kubenswrapper[4744]: I1205 20:42:04.492785 4744 generic.go:334] "Generic (PLEG): container finished" podID="721f12fa-46f1-4f0d-a57b-7f8463a83c77" containerID="29e9e868e49fe0e6de7e312d86270b80a88b7c3bc533f798955d0ede1d0f5a82" exitCode=0
Dec 05 20:42:04 crc kubenswrapper[4744]: I1205 20:42:04.492828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k" event={"ID":"721f12fa-46f1-4f0d-a57b-7f8463a83c77","Type":"ContainerDied","Data":"29e9e868e49fe0e6de7e312d86270b80a88b7c3bc533f798955d0ede1d0f5a82"}
Dec 05 20:42:05 crc kubenswrapper[4744]: I1205 20:42:05.080522 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921"
Dec 05 20:42:05 crc kubenswrapper[4744]: E1205 20:42:05.080818 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:42:05 crc kubenswrapper[4744]: I1205 20:42:05.505065 4744 generic.go:334] "Generic (PLEG): container finished" podID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerID="f4abe7b92f2a7814cc9038c28d88b8fb685aa7f210eb3a21ae0ce8321aa6af83" exitCode=0
Dec 05 20:42:05 crc kubenswrapper[4744]: I1205 20:42:05.505149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerDied","Data":"f4abe7b92f2a7814cc9038c28d88b8fb685aa7f210eb3a21ae0ce8321aa6af83"}
Dec 05 20:42:05 crc kubenswrapper[4744]: I1205 20:42:05.926086 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:05 crc kubenswrapper[4744]: I1205 20:42:05.929329 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.094993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095098 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle\") pod \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095121 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc557\" (UniqueName: \"kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095144 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095254 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095547 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.095987 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.096169 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jsh\" (UniqueName: \"kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh\") pod \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.097080 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs\") pod \"f8701e09-ae1a-465a-a8c7-2249c53e372e\" (UID: \"f8701e09-ae1a-465a-a8c7-2249c53e372e\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.097127 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data\") pod \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.097187 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume\") pod \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\" (UID: \"721f12fa-46f1-4f0d-a57b-7f8463a83c77\") "
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.097918 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.097970 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8701e09-ae1a-465a-a8c7-2249c53e372e-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.101755 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557" (OuterVolumeSpecName: "kube-api-access-bc557") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "kube-api-access-bc557". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.101811 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh" (OuterVolumeSpecName: "kube-api-access-p2jsh") pod "721f12fa-46f1-4f0d-a57b-7f8463a83c77" (UID: "721f12fa-46f1-4f0d-a57b-7f8463a83c77"). InnerVolumeSpecName "kube-api-access-p2jsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.102354 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume" (OuterVolumeSpecName: "scripts-volume") pod "721f12fa-46f1-4f0d-a57b-7f8463a83c77" (UID: "721f12fa-46f1-4f0d-a57b-7f8463a83c77"). InnerVolumeSpecName "scripts-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.102565 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts" (OuterVolumeSpecName: "scripts") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.130714 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "721f12fa-46f1-4f0d-a57b-7f8463a83c77" (UID: "721f12fa-46f1-4f0d-a57b-7f8463a83c77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.135923 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.162401 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.170638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data" (OuterVolumeSpecName: "config-data") pod "721f12fa-46f1-4f0d-a57b-7f8463a83c77" (UID: "721f12fa-46f1-4f0d-a57b-7f8463a83c77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.197221 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data" (OuterVolumeSpecName: "config-data") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.199985 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2jsh\" (UniqueName: \"kubernetes.io/projected/721f12fa-46f1-4f0d-a57b-7f8463a83c77-kube-api-access-p2jsh\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200018 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200031 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200047 4744 reconciler_common.go:293] "Volume detached for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-scripts-volume\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200059 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721f12fa-46f1-4f0d-a57b-7f8463a83c77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200071 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc557\" (UniqueName: \"kubernetes.io/projected/f8701e09-ae1a-465a-a8c7-2249c53e372e-kube-api-access-bc557\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200082 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200095 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.200106 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.203438 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8701e09-ae1a-465a-a8c7-2249c53e372e" (UID: "f8701e09-ae1a-465a-a8c7-2249c53e372e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.301193 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8701e09-ae1a-465a-a8c7-2249c53e372e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.517729 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.519401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k" event={"ID":"721f12fa-46f1-4f0d-a57b-7f8463a83c77","Type":"ContainerDied","Data":"ac19cabfedd23fd64e50fdb8c01ad63ba20af6b577a0cd7058e761853118e7cd"}
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.519458 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac19cabfedd23fd64e50fdb8c01ad63ba20af6b577a0cd7058e761853118e7cd"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.524071 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f8701e09-ae1a-465a-a8c7-2249c53e372e","Type":"ContainerDied","Data":"887b2b69d046197e4e5ce8b9b8988c0e32ffbb5ef5d43ff9cbbedc0d7e54d756"}
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.524167 4744 scope.go:117] "RemoveContainer" containerID="72cc857daaf97271fbb3685c30567841c9fdee329faafb63de0393161163158a"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.524473 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.550949 4744 scope.go:117] "RemoveContainer" containerID="ccb528b61e4927fba7180d07b978f1677a204a6ba5a04d615d32e62a838ba605"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.580520 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.595754 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.597519 4744 scope.go:117] "RemoveContainer" containerID="f4abe7b92f2a7814cc9038c28d88b8fb685aa7f210eb3a21ae0ce8321aa6af83"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.604955 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:06 crc kubenswrapper[4744]: E1205 20:42:06.605373 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721f12fa-46f1-4f0d-a57b-7f8463a83c77" containerName="watcher-db-manage"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605396 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="721f12fa-46f1-4f0d-a57b-7f8463a83c77" containerName="watcher-db-manage"
Dec 05 20:42:06 crc kubenswrapper[4744]: E1205 20:42:06.605408 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-central-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605417 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-central-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: E1205 20:42:06.605443 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="proxy-httpd"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605451 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="proxy-httpd"
Dec 05 20:42:06 crc kubenswrapper[4744]: E1205 20:42:06.605467 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="sg-core"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605474 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="sg-core"
Dec 05 20:42:06 crc kubenswrapper[4744]: E1205 20:42:06.605489 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-notification-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605498 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-notification-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605671 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-central-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605689 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="proxy-httpd"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605703 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="sg-core"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605718 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" containerName="ceilometer-notification-agent"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.605732 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="721f12fa-46f1-4f0d-a57b-7f8463a83c77" containerName="watcher-db-manage"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.607544 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.615415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.616263 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.616263 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.626437 4744 scope.go:117] "RemoveContainer" containerID="5c4a9ce4561c70839437800e1391ba03e9bb294b9ddafbcae3e14b3e9a851efa"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.661357 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708230 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708274 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708326 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708472 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgnlr\" (UniqueName: \"kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708580 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.708693 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810375 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810417 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgnlr\" (UniqueName: \"kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810447 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810521 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.810600 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.811872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.811942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.815064 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.815066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.815182 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.816201 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.822703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.827202 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgnlr\" (UniqueName: \"kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr\") pod \"ceilometer-0\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:06 crc kubenswrapper[4744]: I1205 20:42:06.966062 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 05 20:42:07 crc kubenswrapper[4744]: I1205 20:42:07.450329 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 05 20:42:07 crc kubenswrapper[4744]: I1205 20:42:07.543342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerStarted","Data":"ee6bf18b38ae875b9f79076a0307cd46e302e9e597d521736db79b56ed0aea99"}
Dec 05 20:42:08 crc kubenswrapper[4744]: I1205 20:42:08.094084 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8701e09-ae1a-465a-a8c7-2249c53e372e" path="/var/lib/kubelet/pods/f8701e09-ae1a-465a-a8c7-2249c53e372e/volumes"
Dec 05 20:42:08 crc kubenswrapper[4744]: I1205 20:42:08.559266 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerStarted","Data":"819a6bb3edd2732e26c394aedc9a2cf4fa53feb61973d067508d0f7f2e94bd15"}
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.078658 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-69895"]
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.091756 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-69895"]
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.101959 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"]
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.110619 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29416122-j7s6k"]
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.141998 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-gjgwg"]
Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.143045 4744 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.153732 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-gjgwg"] Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.180023 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.180504 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerName="watcher-applier" containerID="cri-o://3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.232719 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.233601 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" containerName="watcher-decision-engine" containerID="cri-o://00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.250268 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.250415 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc7x\" (UniqueName: \"kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.272565 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.272796 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-kuttl-api-log" containerID="cri-o://72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.273169 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-api" containerID="cri-o://eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.290207 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.295488 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" 
containerName="watcher-kuttl-api-log" containerID="cri-o://c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.295906 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-api" containerID="cri-o://237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f" gracePeriod=30 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.353078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.353210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkc7x\" (UniqueName: \"kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.354344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.376361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkc7x\" (UniqueName: \"kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x\") pod \"watchertest-account-delete-gjgwg\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.458305 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.599771 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerStarted","Data":"96c791445ff35ad3755db5ec0483e8691ccb6074429aac30ac5ea4c7a24143a0"} Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.601705 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerID="72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05" exitCode=143 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.601768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerDied","Data":"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05"} Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.608796 4744 generic.go:334] "Generic (PLEG): container finished" podID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerID="c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00" exitCode=143 Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.608867 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerDied","Data":"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00"} Dec 05 20:42:09 crc kubenswrapper[4744]: I1205 20:42:09.989451 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-gjgwg"] Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.049353 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.050622 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.051842 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.051871 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerName="watcher-applier" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.089861 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c098edd-eac1-4078-831b-efc5572e94ce" path="/var/lib/kubelet/pods/5c098edd-eac1-4078-831b-efc5572e94ce/volumes" Dec 05 
20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.090360 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721f12fa-46f1-4f0d-a57b-7f8463a83c77" path="/var/lib/kubelet/pods/721f12fa-46f1-4f0d-a57b-7f8463a83c77/volumes" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.179185 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.222:9322/\": read tcp 10.217.0.2:46600->10.217.0.222:9322: read: connection reset by peer" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.179210 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.222:9322/\": read tcp 10.217.0.2:46592->10.217.0.222:9322: read: connection reset by peer" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.571565 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.620855 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerID="eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.620917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerDied","Data":"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833"} Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.620944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6ea37095-c0eb-4f83-b06a-561b77d1846a","Type":"ContainerDied","Data":"b08c8b49479f28beb1cea60e48e4bad1526b3f045d8f935dc40400cd95a09ba7"} Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.620962 4744 scope.go:117] "RemoveContainer" containerID="eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.621085 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.625593 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerStarted","Data":"c356285c3a2f886cd56f2f4392785f2ea1b102aa6ec4bd9035f26e6cea920ed7"} Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.627283 4744 generic.go:334] "Generic (PLEG): container finished" podID="2d76a425-7e8a-4443-b56c-55d8cd8483ca" containerID="d3fd0d82f53b413bbfdc2775eb3d6c85012690f65e6bc189f19d4f34073feebf" exitCode=0 Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.627332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" event={"ID":"2d76a425-7e8a-4443-b56c-55d8cd8483ca","Type":"ContainerDied","Data":"d3fd0d82f53b413bbfdc2775eb3d6c85012690f65e6bc189f19d4f34073feebf"} Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.627356 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" event={"ID":"2d76a425-7e8a-4443-b56c-55d8cd8483ca","Type":"ContainerStarted","Data":"4b7d4f69b45da5c3dd5636be13c98c477a079a4a525d67e32b06970002bd655b"} Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.647366 4744 scope.go:117] "RemoveContainer" containerID="72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.671074 4744 scope.go:117] "RemoveContainer" containerID="eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833" Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.671644 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833\": container with ID starting with eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833 not found: ID does not exist" containerID="eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.671675 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833"} err="failed to get container status \"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833\": rpc error: code = NotFound desc = could not find container \"eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833\": container with ID starting with eb8551d493d67d6cb5fef4ab90d717d3ffcaeaaa626865ab93e657e9e3eab833 not found: ID does not exist" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.671710 4744 scope.go:117] "RemoveContainer" containerID="72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05" Dec 05 20:42:10 crc kubenswrapper[4744]: E1205 20:42:10.671989 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05\": container with ID starting with 72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05 not found: ID does not exist" containerID="72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.672028 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05"} err="failed to get container status \"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05\": rpc error: code = NotFound desc = could not find container \"72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05\": container with ID starting with 72d371ac878e4a5548c80ad3ac8f606838e140a4c9ec401bd7cc43882122bc05 not found: ID does not exist" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.678928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.678980 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.679032 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.679088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.679129 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.679185 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nb25\" (UniqueName: \"kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25\") pod \"6ea37095-c0eb-4f83-b06a-561b77d1846a\" (UID: \"6ea37095-c0eb-4f83-b06a-561b77d1846a\") " Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.683694 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs" (OuterVolumeSpecName: "logs") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.699482 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25" (OuterVolumeSpecName: "kube-api-access-5nb25") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "kube-api-access-5nb25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.707576 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.708885 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.731872 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data" (OuterVolumeSpecName: "config-data") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.763380 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "6ea37095-c0eb-4f83-b06a-561b77d1846a" (UID: "6ea37095-c0eb-4f83-b06a-561b77d1846a"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781200 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781255 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781267 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nb25\" (UniqueName: \"kubernetes.io/projected/6ea37095-c0eb-4f83-b06a-561b77d1846a-kube-api-access-5nb25\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781279 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781314 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea37095-c0eb-4f83-b06a-561b77d1846a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.781328 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6ea37095-c0eb-4f83-b06a-561b77d1846a-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.950959 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.224:9322/\": read tcp 10.217.0.2:38528->10.217.0.224:9322: read: connection reset by peer" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.951535 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.224:9322/\": read tcp 10.217.0.2:38522->10.217.0.224:9322: read: connection reset by peer" Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.982172 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:42:10 crc kubenswrapper[4744]: I1205 20:42:10.996820 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.315997 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402100 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402149 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq545\" (UniqueName: \"kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402273 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402381 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.402413 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle\") pod \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\" (UID: \"bac364f1-17fe-4d5e-9ce8-bc07cb076890\") " Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.405498 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs" (OuterVolumeSpecName: "logs") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.415181 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545" (OuterVolumeSpecName: "kube-api-access-zq545") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "kube-api-access-zq545". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.427409 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.431689 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.472514 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data" (OuterVolumeSpecName: "config-data") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.504150 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.504181 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.504189 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.504198 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac364f1-17fe-4d5e-9ce8-bc07cb076890-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.504207 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq545\" (UniqueName: \"kubernetes.io/projected/bac364f1-17fe-4d5e-9ce8-bc07cb076890-kube-api-access-zq545\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.515396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "bac364f1-17fe-4d5e-9ce8-bc07cb076890" (UID: "bac364f1-17fe-4d5e-9ce8-bc07cb076890"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.606173 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bac364f1-17fe-4d5e-9ce8-bc07cb076890-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.639114 4744 generic.go:334] "Generic (PLEG): container finished" podID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerID="237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f" exitCode=0 Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.639160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerDied","Data":"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f"} Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.639183 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"bac364f1-17fe-4d5e-9ce8-bc07cb076890","Type":"ContainerDied","Data":"cf1b90d2252000bcce90e62cb2097cec60aa27dc7f1075c0cc96820aaadbf867"} Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.639200 4744 scope.go:117] "RemoveContainer" containerID="237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.639283 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.650830 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerStarted","Data":"f4b44e19adf72c984596d362a7f1b4b37cdcebb41d3f09cc863fcc8dd2d0dd74"} Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.651487 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.678122 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.330288097 podStartE2EDuration="5.678105228s" podCreationTimestamp="2025-12-05 20:42:06 +0000 UTC" firstStartedPulling="2025-12-05 20:42:07.453075759 +0000 UTC m=+1897.682887137" lastFinishedPulling="2025-12-05 20:42:10.8008929 +0000 UTC m=+1901.030704268" observedRunningTime="2025-12-05 20:42:11.670554884 +0000 UTC m=+1901.900366252" watchObservedRunningTime="2025-12-05 20:42:11.678105228 +0000 UTC m=+1901.907916596" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.680062 4744 scope.go:117] "RemoveContainer" containerID="c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.713030 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.717398 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.752466 4744 scope.go:117] "RemoveContainer" containerID="237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f" Dec 05 20:42:11 crc kubenswrapper[4744]: E1205 20:42:11.753836 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f\": container with ID starting with 237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f not found: ID does not exist" containerID="237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.753887 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f"} err="failed to get container status \"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f\": rpc error: code = NotFound desc = could not find container \"237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f\": container with ID starting with 237d57e31131b2906c9fbcd0d8c4216b16561a8959845da422a59ab5600f304f not found: ID does not exist" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.753919 4744 scope.go:117] "RemoveContainer" containerID="c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00" Dec 05 20:42:11 crc kubenswrapper[4744]: E1205 20:42:11.754658 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00\": container with ID starting with c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00 not found: ID does not exist" containerID="c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00" Dec 05 20:42:11 crc kubenswrapper[4744]: I1205 20:42:11.754695 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00"} err="failed to get container status \"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00\": rpc error: code = NotFound desc = could not find container \"c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00\": container with ID starting with c9fd73de858fee090acd7c7512d24e927654dd268bbe71d0ba60338eb8ad5a00 not found: ID does not exist" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.140282 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" path="/var/lib/kubelet/pods/6ea37095-c0eb-4f83-b06a-561b77d1846a/volumes" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.140987 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" path="/var/lib/kubelet/pods/bac364f1-17fe-4d5e-9ce8-bc07cb076890/volumes" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.213929 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.347904 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkc7x\" (UniqueName: \"kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x\") pod \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.348361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts\") pod \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\" (UID: \"2d76a425-7e8a-4443-b56c-55d8cd8483ca\") " Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.348815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d76a425-7e8a-4443-b56c-55d8cd8483ca" (UID: "2d76a425-7e8a-4443-b56c-55d8cd8483ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.353538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x" (OuterVolumeSpecName: "kube-api-access-xkc7x") pod "2d76a425-7e8a-4443-b56c-55d8cd8483ca" (UID: "2d76a425-7e8a-4443-b56c-55d8cd8483ca"). InnerVolumeSpecName "kube-api-access-xkc7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.416602 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.450757 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkc7x\" (UniqueName: \"kubernetes.io/projected/2d76a425-7e8a-4443-b56c-55d8cd8483ca-kube-api-access-xkc7x\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.450796 4744 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d76a425-7e8a-4443-b56c-55d8cd8483ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.667896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" event={"ID":"2d76a425-7e8a-4443-b56c-55d8cd8483ca","Type":"ContainerDied","Data":"4b7d4f69b45da5c3dd5636be13c98c477a079a4a525d67e32b06970002bd655b"} Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.667943 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7d4f69b45da5c3dd5636be13c98c477a079a4a525d67e32b06970002bd655b" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.668007 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-gjgwg" Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.709516 4744 generic.go:334] "Generic (PLEG): container finished" podID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerID="3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" exitCode=0 Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.710232 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"44ff514e-253b-4ec3-b370-35e2aa9f6103","Type":"ContainerDied","Data":"3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4"} Dec 05 20:42:12 crc kubenswrapper[4744]: I1205 20:42:12.931118 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.076769 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ff69\" (UniqueName: \"kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69\") pod \"44ff514e-253b-4ec3-b370-35e2aa9f6103\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.076895 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle\") pod \"44ff514e-253b-4ec3-b370-35e2aa9f6103\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.076997 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls\") pod \"44ff514e-253b-4ec3-b370-35e2aa9f6103\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.077022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data\") pod \"44ff514e-253b-4ec3-b370-35e2aa9f6103\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.077138 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs\") pod \"44ff514e-253b-4ec3-b370-35e2aa9f6103\" (UID: \"44ff514e-253b-4ec3-b370-35e2aa9f6103\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.077789 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs" (OuterVolumeSpecName: "logs") pod "44ff514e-253b-4ec3-b370-35e2aa9f6103" (UID: "44ff514e-253b-4ec3-b370-35e2aa9f6103"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.091669 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69" (OuterVolumeSpecName: "kube-api-access-5ff69") pod "44ff514e-253b-4ec3-b370-35e2aa9f6103" (UID: "44ff514e-253b-4ec3-b370-35e2aa9f6103"). InnerVolumeSpecName "kube-api-access-5ff69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.113396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44ff514e-253b-4ec3-b370-35e2aa9f6103" (UID: "44ff514e-253b-4ec3-b370-35e2aa9f6103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.163507 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "44ff514e-253b-4ec3-b370-35e2aa9f6103" (UID: "44ff514e-253b-4ec3-b370-35e2aa9f6103"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.180437 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data" (OuterVolumeSpecName: "config-data") pod "44ff514e-253b-4ec3-b370-35e2aa9f6103" (UID: "44ff514e-253b-4ec3-b370-35e2aa9f6103"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.181461 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ff514e-253b-4ec3-b370-35e2aa9f6103-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.181484 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ff69\" (UniqueName: \"kubernetes.io/projected/44ff514e-253b-4ec3-b370-35e2aa9f6103-kube-api-access-5ff69\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.181494 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.181502 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.181509 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ff514e-253b-4ec3-b370-35e2aa9f6103-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.241781 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.383971 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384024 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxxlm\" (UniqueName: \"kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384191 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs" (OuterVolumeSpecName: "logs") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384561 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384632 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca\") pod \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\" (UID: \"74a97696-88ce-4bcc-9ae0-7bf972ecc08b\") " Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.384946 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.388437 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm" (OuterVolumeSpecName: "kube-api-access-hxxlm") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "kube-api-access-hxxlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.407040 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.417088 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.434813 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data" (OuterVolumeSpecName: "config-data") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.452381 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "74a97696-88ce-4bcc-9ae0-7bf972ecc08b" (UID: "74a97696-88ce-4bcc-9ae0-7bf972ecc08b"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.486350 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.486638 4744 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.486701 4744 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.486785 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxxlm\" (UniqueName: \"kubernetes.io/projected/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-kube-api-access-hxxlm\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.486842 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a97696-88ce-4bcc-9ae0-7bf972ecc08b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.723149 4744 generic.go:334] "Generic (PLEG): container finished" podID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" containerID="00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e" exitCode=0 Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.723221 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.723239 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"74a97696-88ce-4bcc-9ae0-7bf972ecc08b","Type":"ContainerDied","Data":"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e"} Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.729706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"74a97696-88ce-4bcc-9ae0-7bf972ecc08b","Type":"ContainerDied","Data":"eae3d4c33b08a362cf30f49ef24c8b6ce6f4ba7a510042e59c3831429d426a1c"} Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.729749 4744 scope.go:117] "RemoveContainer" containerID="00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732040 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-central-agent" containerID="cri-o://819a6bb3edd2732e26c394aedc9a2cf4fa53feb61973d067508d0f7f2e94bd15" gracePeriod=30 Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732226 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="sg-core" containerID="cri-o://c356285c3a2f886cd56f2f4392785f2ea1b102aa6ec4bd9035f26e6cea920ed7" gracePeriod=30 Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732353 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-notification-agent" containerID="cri-o://96c791445ff35ad3755db5ec0483e8691ccb6074429aac30ac5ea4c7a24143a0" gracePeriod=30 Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"44ff514e-253b-4ec3-b370-35e2aa9f6103","Type":"ContainerDied","Data":"e22016c983aa1d35b9e9e3dc15cbc7e9417030b776d9f667fecdbb1196c3137f"} Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732476 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.732626 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="proxy-httpd" containerID="cri-o://f4b44e19adf72c984596d362a7f1b4b37cdcebb41d3f09cc863fcc8dd2d0dd74" gracePeriod=30 Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.765105 4744 scope.go:117] "RemoveContainer" containerID="00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e" Dec 05 20:42:13 crc kubenswrapper[4744]: E1205 20:42:13.765706 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e\": container with ID starting with 00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e not found: ID does not exist" containerID="00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.765766 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e"} err="failed to get container status \"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e\": rpc error: code = NotFound desc = could not find container \"00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e\": container with ID starting with 00e03da8880b706f8c3da6ef89466962352849f35418fc358ebdd82635328d0e not found: ID does not exist" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.765824 4744 scope.go:117] "RemoveContainer" containerID="3816463ed5dd50dded6eba67277d9b0830527dca65e1d577dd07fea17c85ccc4" Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.773100 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.779965 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.825713 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:42:13 crc kubenswrapper[4744]: I1205 20:42:13.834179 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.091782 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" path="/var/lib/kubelet/pods/44ff514e-253b-4ec3-b370-35e2aa9f6103/volumes" Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.092758 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" path="/var/lib/kubelet/pods/74a97696-88ce-4bcc-9ae0-7bf972ecc08b/volumes" Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.179889 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-4mwtz"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.194500 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-4mwtz"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.201762 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watchertest-account-delete-gjgwg"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.208923 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-vmm54"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.215381 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-gjgwg"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.222348 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-vmm54"] Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789421 4744 generic.go:334] "Generic (PLEG): container finished" podID="af367edc-d594-4acc-9af7-ffd743f91f28" containerID="f4b44e19adf72c984596d362a7f1b4b37cdcebb41d3f09cc863fcc8dd2d0dd74" exitCode=0 Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789458 4744 generic.go:334] "Generic (PLEG): container finished" podID="af367edc-d594-4acc-9af7-ffd743f91f28" containerID="c356285c3a2f886cd56f2f4392785f2ea1b102aa6ec4bd9035f26e6cea920ed7" exitCode=2 Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789469 4744 generic.go:334] "Generic (PLEG): container finished" podID="af367edc-d594-4acc-9af7-ffd743f91f28" containerID="96c791445ff35ad3755db5ec0483e8691ccb6074429aac30ac5ea4c7a24143a0" exitCode=0 Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789477 4744 generic.go:334] "Generic (PLEG): container finished" podID="af367edc-d594-4acc-9af7-ffd743f91f28" containerID="819a6bb3edd2732e26c394aedc9a2cf4fa53feb61973d067508d0f7f2e94bd15" exitCode=0 Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerDied","Data":"f4b44e19adf72c984596d362a7f1b4b37cdcebb41d3f09cc863fcc8dd2d0dd74"} Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789511 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerDied","Data":"c356285c3a2f886cd56f2f4392785f2ea1b102aa6ec4bd9035f26e6cea920ed7"} Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerDied","Data":"96c791445ff35ad3755db5ec0483e8691ccb6074429aac30ac5ea4c7a24143a0"} Dec 05 20:42:14 crc kubenswrapper[4744]: I1205 20:42:14.789557 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerDied","Data":"819a6bb3edd2732e26c394aedc9a2cf4fa53feb61973d067508d0f7f2e94bd15"} Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.030528 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-mrwcv"] Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.035462 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-mrwcv"] Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.060241 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.213701 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.213970 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214109 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214195 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgnlr\" (UniqueName: \"kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214313 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214397 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214508 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.214594 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml\") pod \"af367edc-d594-4acc-9af7-ffd743f91f28\" (UID: \"af367edc-d594-4acc-9af7-ffd743f91f28\") " Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.215042 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.216977 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.221018 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts" (OuterVolumeSpecName: "scripts") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.221029 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr" (OuterVolumeSpecName: "kube-api-access-hgnlr") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "kube-api-access-hgnlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.323056 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.323091 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.323108 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgnlr\" (UniqueName: \"kubernetes.io/projected/af367edc-d594-4acc-9af7-ffd743f91f28-kube-api-access-hgnlr\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.323120 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af367edc-d594-4acc-9af7-ffd743f91f28-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.345436 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.345463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.366354 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.369141 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data" (OuterVolumeSpecName: "config-data") pod "af367edc-d594-4acc-9af7-ffd743f91f28" (UID: "af367edc-d594-4acc-9af7-ffd743f91f28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.424125 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.424160 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.424169 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.424177 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af367edc-d594-4acc-9af7-ffd743f91f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.804348 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af367edc-d594-4acc-9af7-ffd743f91f28","Type":"ContainerDied","Data":"ee6bf18b38ae875b9f79076a0307cd46e302e9e597d521736db79b56ed0aea99"} Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.804395 4744 scope.go:117] "RemoveContainer" containerID="f4b44e19adf72c984596d362a7f1b4b37cdcebb41d3f09cc863fcc8dd2d0dd74" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.804618 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.838992 4744 scope.go:117] "RemoveContainer" containerID="c356285c3a2f886cd56f2f4392785f2ea1b102aa6ec4bd9035f26e6cea920ed7" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.866033 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.876420 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.881454 4744 scope.go:117] "RemoveContainer" containerID="96c791445ff35ad3755db5ec0483e8691ccb6074429aac30ac5ea4c7a24143a0" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890342 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890717 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerName="watcher-applier" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890732 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerName="watcher-applier" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890752 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d76a425-7e8a-4443-b56c-55d8cd8483ca" containerName="mariadb-account-delete" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890760 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d76a425-7e8a-4443-b56c-55d8cd8483ca" containerName="mariadb-account-delete" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890772 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890779 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890794 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-central-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890801 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-central-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890817 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" containerName="watcher-decision-engine" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890825 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" containerName="watcher-decision-engine" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890844 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-notification-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890851 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-notification-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890867 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="proxy-httpd" 
Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890874 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="proxy-httpd" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890885 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="sg-core" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890892 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="sg-core" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890903 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890911 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890924 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890931 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: E1205 20:42:15.890940 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.890948 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891119 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891138 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ff514e-253b-4ec3-b370-35e2aa9f6103" containerName="watcher-applier" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891151 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891165 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a97696-88ce-4bcc-9ae0-7bf972ecc08b" containerName="watcher-decision-engine" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891178 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d76a425-7e8a-4443-b56c-55d8cd8483ca" containerName="mariadb-account-delete" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891189 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-central-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891201 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac364f1-17fe-4d5e-9ce8-bc07cb076890" containerName="watcher-kuttl-api-log" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891216 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="proxy-httpd" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891225 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ea37095-c0eb-4f83-b06a-561b77d1846a" containerName="watcher-api" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891232 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="ceilometer-notification-agent" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.891241 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" containerName="sg-core" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.906052 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.906378 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.912004 4744 scope.go:117] "RemoveContainer" containerID="819a6bb3edd2732e26c394aedc9a2cf4fa53feb61973d067508d0f7f2e94bd15" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.912686 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.912838 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 05 20:42:15 crc kubenswrapper[4744]: I1205 20:42:15.912696 4744 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.036574 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-scripts\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.036652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.036676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.036739 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htpj\" (UniqueName: \"kubernetes.io/projected/ab9ab744-e215-4bca-a2d4-53969e490cdb-kube-api-access-9htpj\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.037038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-run-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.037150 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-config-data\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.037240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.037337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-log-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.081156 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:42:16 crc kubenswrapper[4744]: E1205 20:42:16.081434 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.099276 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215b84b6-8bdb-4102-9ef1-80ef9f6a538e" path="/var/lib/kubelet/pods/215b84b6-8bdb-4102-9ef1-80ef9f6a538e/volumes" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.100588 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d76a425-7e8a-4443-b56c-55d8cd8483ca" path="/var/lib/kubelet/pods/2d76a425-7e8a-4443-b56c-55d8cd8483ca/volumes" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.101795 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af367edc-d594-4acc-9af7-ffd743f91f28" path="/var/lib/kubelet/pods/af367edc-d594-4acc-9af7-ffd743f91f28/volumes" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.104186 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26fa528-ef83-4870-ba3d-ae08e92b47b9" path="/var/lib/kubelet/pods/f26fa528-ef83-4870-ba3d-ae08e92b47b9/volumes" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.105505 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f" path="/var/lib/kubelet/pods/f3b4ac43-f6ca-4b90-8e8b-feeb18e3bc9f/volumes" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144119 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144190 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144311 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htpj\" (UniqueName: \"kubernetes.io/projected/ab9ab744-e215-4bca-a2d4-53969e490cdb-kube-api-access-9htpj\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144389 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-run-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144477 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-config-data\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-log-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.144710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-scripts\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.146000 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-run-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.146103 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab9ab744-e215-4bca-a2d4-53969e490cdb-log-httpd\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.147782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.149769 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-scripts\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.151248 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-config-data\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.152788 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.153722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9ab744-e215-4bca-a2d4-53969e490cdb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.170853 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htpj\" (UniqueName: \"kubernetes.io/projected/ab9ab744-e215-4bca-a2d4-53969e490cdb-kube-api-access-9htpj\") pod \"ceilometer-0\" (UID: \"ab9ab744-e215-4bca-a2d4-53969e490cdb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.239751 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.727345 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 05 20:42:16 crc kubenswrapper[4744]: I1205 20:42:16.816143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ab9ab744-e215-4bca-a2d4-53969e490cdb","Type":"ContainerStarted","Data":"4bf1486d787e78aeea5d008ef3d3f8faa78b79ed812ae96fbf394fc7e33eab2a"} Dec 05 20:42:17 crc kubenswrapper[4744]: I1205 20:42:17.827684 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ab9ab744-e215-4bca-a2d4-53969e490cdb","Type":"ContainerStarted","Data":"b8a8b6dab217d865ce0ac714da26b9523a94e66a7b205c1c83e2ebe491171571"} Dec 05 20:42:18 crc kubenswrapper[4744]: I1205 20:42:18.838344 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ab9ab744-e215-4bca-a2d4-53969e490cdb","Type":"ContainerStarted","Data":"642e5ad71e3b5a0cdb119df4be7f4dda23becb1a0eac3660b47d08345c24d86a"} Dec 05 20:42:18 crc kubenswrapper[4744]: I1205 20:42:18.838799 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ab9ab744-e215-4bca-a2d4-53969e490cdb","Type":"ContainerStarted","Data":"d671f6e4ff1fcfc7fe40a8bd34885e1174ee7e66ba6e4b013250294077655865"} Dec 05 20:42:19 crc kubenswrapper[4744]: I1205 20:42:19.848429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ab9ab744-e215-4bca-a2d4-53969e490cdb","Type":"ContainerStarted","Data":"cc2dda92df7c6bc8737f9ff339c02ae85c2459a65956707e6095e51e22557ed6"} Dec 05 20:42:19 crc kubenswrapper[4744]: I1205 20:42:19.848869 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:19 crc kubenswrapper[4744]: I1205 20:42:19.873668 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.108492642 podStartE2EDuration="4.87364264s" podCreationTimestamp="2025-12-05 20:42:15 +0000 UTC" firstStartedPulling="2025-12-05 20:42:16.736802602 +0000 UTC m=+1906.966613990" lastFinishedPulling="2025-12-05 20:42:19.50195262 +0000 UTC m=+1909.731763988" observedRunningTime="2025-12-05 20:42:19.865625274 +0000 UTC m=+1910.095436642" watchObservedRunningTime="2025-12-05 20:42:19.87364264 +0000 UTC m=+1910.103454018" Dec 05 20:42:31 crc kubenswrapper[4744]: I1205 20:42:31.080734 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:42:31 crc kubenswrapper[4744]: E1205 20:42:31.082099 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:42:36 crc kubenswrapper[4744]: I1205 20:42:36.746433 4744 scope.go:117] "RemoveContainer" containerID="d839f00b5da1fdd7f5c48fcbcc62959786f1353040ce510d3f6bd821e83b0e9a" Dec 05 20:42:36 crc kubenswrapper[4744]: I1205 20:42:36.798484 4744 scope.go:117] "RemoveContainer" 
containerID="9de0d954d9caf7c5d5eb39e4e457edec9303f4562c1386f703c80fe2776f91a8" Dec 05 20:42:36 crc kubenswrapper[4744]: I1205 20:42:36.838229 4744 scope.go:117] "RemoveContainer" containerID="33e5c90f8e50d62e3036d65b4ce2f67a2477301cd44398ba9da4ffd2ca6283be" Dec 05 20:42:36 crc kubenswrapper[4744]: I1205 20:42:36.906514 4744 scope.go:117] "RemoveContainer" containerID="ed8f9472f629c57d540ad8173288862985f4214695222941bf27f9f2da8817c9" Dec 05 20:42:36 crc kubenswrapper[4744]: I1205 20:42:36.932450 4744 scope.go:117] "RemoveContainer" containerID="d3dc5d164c98cce0dd3794f22e4f0b9a0f892af71335254d651126c1503f99c3" Dec 05 20:42:44 crc kubenswrapper[4744]: I1205 20:42:44.080941 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:42:44 crc kubenswrapper[4744]: E1205 20:42:44.081720 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:42:46 crc kubenswrapper[4744]: I1205 20:42:46.259354 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.128084 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fb5tg/must-gather-xft2c"] Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.129989 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.144836 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fb5tg"/"default-dockercfg-2cdpz" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.145341 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fb5tg"/"kube-root-ca.crt" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.145389 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fb5tg"/"openshift-service-ca.crt" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.163835 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fb5tg/must-gather-xft2c"] Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.238275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.238667 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whkxg\" (UniqueName: \"kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.340104 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whkxg\" 
(UniqueName: \"kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.340202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.340690 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.358948 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whkxg\" (UniqueName: \"kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg\") pod \"must-gather-xft2c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:50 crc kubenswrapper[4744]: I1205 20:42:50.465201 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:42:51 crc kubenswrapper[4744]: I1205 20:42:51.043556 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fb5tg/must-gather-xft2c"] Dec 05 20:42:51 crc kubenswrapper[4744]: I1205 20:42:51.164040 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fb5tg/must-gather-xft2c" event={"ID":"7631d304-47c0-4814-825c-c1c7297c585c","Type":"ContainerStarted","Data":"8ffa7c056126d835b3b159e226aca3653a0f51c774c2cbc898a2cde5337c9424"} Dec 05 20:42:55 crc kubenswrapper[4744]: I1205 20:42:55.081029 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:42:55 crc kubenswrapper[4744]: E1205 20:42:55.081864 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:42:57 crc kubenswrapper[4744]: I1205 20:42:57.225202 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fb5tg/must-gather-xft2c" event={"ID":"7631d304-47c0-4814-825c-c1c7297c585c","Type":"ContainerStarted","Data":"aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a"} Dec 05 20:42:57 crc kubenswrapper[4744]: I1205 20:42:57.227798 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fb5tg/must-gather-xft2c" event={"ID":"7631d304-47c0-4814-825c-c1c7297c585c","Type":"ContainerStarted","Data":"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea"} Dec 05 20:42:57 crc kubenswrapper[4744]: I1205 20:42:57.251678 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-fb5tg/must-gather-xft2c" podStartSLOduration=2.764971284 podStartE2EDuration="7.251659943s" podCreationTimestamp="2025-12-05 20:42:50 +0000 UTC" firstStartedPulling="2025-12-05 20:42:51.050990811 +0000 UTC m=+1941.280802179" lastFinishedPulling="2025-12-05 20:42:55.53767947 +0000 UTC m=+1945.767490838" observedRunningTime="2025-12-05 20:42:57.237409246 +0000 UTC m=+1947.467220634" watchObservedRunningTime="2025-12-05 20:42:57.251659943 +0000 UTC m=+1947.481471331" Dec 05 20:43:09 crc kubenswrapper[4744]: I1205 20:43:09.080658 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:43:09 crc kubenswrapper[4744]: E1205 20:43:09.081430 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:43:24 crc kubenswrapper[4744]: I1205 20:43:24.081403 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:43:24 crc kubenswrapper[4744]: E1205 20:43:24.082049 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:43:37 crc kubenswrapper[4744]: I1205 20:43:37.182836 4744 scope.go:117] "RemoveContainer" containerID="b5417a2f9d6daa4ff819c2650d9c6bee168a458c4c0ed892c21a7cd9ad34871e" Dec 05 20:43:39 crc kubenswrapper[4744]: I1205 20:43:39.080208 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:43:39 crc kubenswrapper[4744]: E1205 20:43:39.080739 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.692028 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.694827 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.703507 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.720367 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.720477 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjbz\" (UniqueName: \"kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.720507 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.822869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.823451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjbz\" (UniqueName: \"kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.823500 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.824131 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.825959 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:49 crc kubenswrapper[4744]: I1205 20:43:49.864972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rbjbz\" (UniqueName: \"kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz\") pod \"redhat-operators-6xttb\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:50 crc kubenswrapper[4744]: I1205 20:43:50.059498 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:43:50 crc kubenswrapper[4744]: I1205 20:43:50.513766 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:43:50 crc kubenswrapper[4744]: I1205 20:43:50.656251 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerStarted","Data":"792a2a78da6284b48a03329e4214861ad58054f2430c366a342f5b9d5c461249"} Dec 05 20:43:51 crc kubenswrapper[4744]: I1205 20:43:51.667062 4744 generic.go:334] "Generic (PLEG): container finished" podID="3705bc3a-9189-4ed9-937b-bdc167887481" containerID="2231b0e5ad1184659a35b5735db1005c62c45a506ba55ce3ed289a6709df92ae" exitCode=0 Dec 05 20:43:51 crc kubenswrapper[4744]: I1205 20:43:51.667124 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerDied","Data":"2231b0e5ad1184659a35b5735db1005c62c45a506ba55ce3ed289a6709df92ae"} Dec 05 20:43:51 crc kubenswrapper[4744]: I1205 20:43:51.670611 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:43:52 crc kubenswrapper[4744]: I1205 20:43:52.676511 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerStarted","Data":"cbb084559a815a2666aed86dce81090eb22b0479cc55648089c4855cfef7589c"} Dec 05 20:43:53 crc kubenswrapper[4744]: I1205 20:43:53.081205 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:43:53 crc kubenswrapper[4744]: I1205 20:43:53.685611 4744 generic.go:334] "Generic (PLEG): container finished" podID="3705bc3a-9189-4ed9-937b-bdc167887481" containerID="cbb084559a815a2666aed86dce81090eb22b0479cc55648089c4855cfef7589c" exitCode=0 Dec 05 20:43:53 crc kubenswrapper[4744]: I1205 20:43:53.685775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerDied","Data":"cbb084559a815a2666aed86dce81090eb22b0479cc55648089c4855cfef7589c"} Dec 05 20:43:53 crc kubenswrapper[4744]: I1205 20:43:53.695245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078"} Dec 05 20:43:54 crc kubenswrapper[4744]: I1205 20:43:54.709102 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerStarted","Data":"fbfac2d2454063459ba4b3f3db606d7003fdf0776a431285a5cda2b14f8312db"} Dec 05 20:43:55 crc kubenswrapper[4744]: I1205 20:43:55.736956 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-6xttb" podStartSLOduration=4.248933433 podStartE2EDuration="6.736939345s" podCreationTimestamp="2025-12-05 20:43:49 +0000 UTC" firstStartedPulling="2025-12-05 20:43:51.67035744 +0000 UTC m=+2001.900168808" lastFinishedPulling="2025-12-05 20:43:54.158363342 +0000 UTC m=+2004.388174720" observedRunningTime="2025-12-05 20:43:55.733726776 +0000 UTC m=+2005.963538134" watchObservedRunningTime="2025-12-05 20:43:55.736939345 +0000 UTC m=+2005.966750713" Dec 05 20:44:00 crc kubenswrapper[4744]: I1205 20:44:00.060343 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:00 crc kubenswrapper[4744]: I1205 20:44:00.060901 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:01 crc kubenswrapper[4744]: I1205 20:44:01.110263 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6xttb" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="registry-server" probeResult="failure" output=< Dec 05 20:44:01 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Dec 05 20:44:01 crc kubenswrapper[4744]: > Dec 05 20:44:02 crc kubenswrapper[4744]: I1205 20:44:02.724265 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rw45s_3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f/manager/0.log" Dec 05 20:44:02 crc kubenswrapper[4744]: I1205 20:44:02.733810 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rw45s_3c35e949-b9a5-4f22-a2db-7a2f27b4bb8f/kube-rbac-proxy/0.log" Dec 05 20:44:02 crc kubenswrapper[4744]: I1205 20:44:02.926761 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/util/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.072093 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/pull/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.104973 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/pull/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.126684 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/util/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.369413 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/extract/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.375804 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/pull/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.416986 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66mlvv6_b06ca702-ab58-4297-ab66-f8fbc71358e5/util/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.569367 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kj76m_b82be17d-c46f-4d8d-9264-d51d1b2ef12e/kube-rbac-proxy/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.615550 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kj76m_b82be17d-c46f-4d8d-9264-d51d1b2ef12e/manager/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.677971 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/util/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.866062 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/pull/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.874821 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/util/0.log" Dec 05 20:44:03 crc kubenswrapper[4744]: I1205 20:44:03.903462 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/pull/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.029489 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/util/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.036700 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/pull/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.101686 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d513b2c00466755487b974e4bf14ce6bfc6f4128ef0df5a7237e9c1412s85tq_81b7a1b4-a09c-4c8c-841d-5a8d8deed699/extract/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.204714 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-b4sx5_15e7fd30-4c0d-45f6-8905-ab235fc32e16/kube-rbac-proxy/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.214003 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-b4sx5_15e7fd30-4c0d-45f6-8905-ab235fc32e16/manager/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.306484 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8pp8k_e038d15c-67e9-4551-b13b-c541b4b76827/kube-rbac-proxy/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.374082 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8pp8k_e038d15c-67e9-4551-b13b-c541b4b76827/manager/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: 
I1205 20:44:04.467230 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-n4ljw_a4ad9153-5de0-4bb5-a419-fe70e3099450/manager/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.498504 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-n4ljw_a4ad9153-5de0-4bb5-a419-fe70e3099450/kube-rbac-proxy/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.615206 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mmzcc_6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74/kube-rbac-proxy/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.646717 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mmzcc_6c19a4e0-7d3f-44b2-9e16-8e7cdc24ab74/manager/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.816892 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-5bfxc_341519c2-107a-440a-bfbb-af937e0c681f/kube-rbac-proxy/0.log" Dec 05 20:44:04 crc kubenswrapper[4744]: I1205 20:44:04.948405 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-5bfxc_341519c2-107a-440a-bfbb-af937e0c681f/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.007679 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4dm28_8656f7db-cb1e-40fa-ba97-93a647f869ac/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.074620 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4dm28_8656f7db-cb1e-40fa-ba97-93a647f869ac/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.140305 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-57gjb_021ea569-b351-4d31-8080-75f5ec005daa/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.269869 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-57gjb_021ea569-b351-4d31-8080-75f5ec005daa/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.374656 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-879df_022c2e13-58dd-42d3-a3a4-91a3eb74e0b5/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.375021 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-879df_022c2e13-58dd-42d3-a3a4-91a3eb74e0b5/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.464024 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m8844_69215470-ec91-4d88-99f1-99117a543086/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.621204 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m8844_69215470-ec91-4d88-99f1-99117a543086/manager/0.log" Dec 05 20:44:05 crc 
kubenswrapper[4744]: I1205 20:44:05.642765 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dffqv_0d2bbe6b-5adb-402a-8ef6-d7be819d5b73/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.659962 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dffqv_0d2bbe6b-5adb-402a-8ef6-d7be819d5b73/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.834870 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-6pdfn_f3194afc-f21e-4fb0-bc31-5ac4b1b6e434/kube-rbac-proxy/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.838302 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-6pdfn_f3194afc-f21e-4fb0-bc31-5ac4b1b6e434/manager/0.log" Dec 05 20:44:05 crc kubenswrapper[4744]: I1205 20:44:05.996036 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-kdhmb_2c3d0695-b544-47a5-ad85-36f8fd2f1dcb/kube-rbac-proxy/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.011254 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-kdhmb_2c3d0695-b544-47a5-ad85-36f8fd2f1dcb/manager/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.015376 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m_43979be1-9cc5-445f-b079-b4504355cce4/kube-rbac-proxy/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.050372 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4ht46m_43979be1-9cc5-445f-b079-b4504355cce4/manager/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.269212 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nxws4_9b8a4464-7411-4fb6-9078-c922921d4a65/registry-server/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.364039 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-qqjj8_588bb9d2-d747-43cb-8e9f-73d1961bebf1/kube-rbac-proxy/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.388829 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-qqjj8_588bb9d2-d747-43cb-8e9f-73d1961bebf1/manager/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.527153 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4b9p2_807989ca-0470-47bc-8bef-9c1dd35e4bb0/kube-rbac-proxy/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.611699 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4b9p2_807989ca-0470-47bc-8bef-9c1dd35e4bb0/manager/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.790704 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-9t8ch_4aae801a-e589-469a-b153-116744edc63b/kube-rbac-proxy/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.796327 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lh9hc_41b48b2f-7b8d-46df-a226-6c163e4f57b0/operator/0.log" Dec 05 20:44:06 crc kubenswrapper[4744]: I1205 20:44:06.862485 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-9t8ch_4aae801a-e589-469a-b153-116744edc63b/manager/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.041363 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dc867b75-npt5k_63391739-cc08-49ea-be59-2c0740078450/manager/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.097827 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-v879l_5e4a6d16-0c89-4bd1-aa53-ce798baff113/kube-rbac-proxy/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.199077 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-v879l_5e4a6d16-0c89-4bd1-aa53-ce798baff113/manager/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.392409 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lbxdx_564e07f4-0673-42d8-a7ee-68366706b2d4/manager/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.429302 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lbxdx_564e07f4-0673-42d8-a7ee-68366706b2d4/kube-rbac-proxy/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.618654 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-t4s2v_3018fddf-6d8b-437d-9a8b-dd585664b159/registry-server/0.log" Dec 05 20:44:07 crc kubenswrapper[4744]: I1205 20:44:07.795238 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd9866b7f-kbvtl_ab409bd9-6116-4c63-b990-0bdf2214420a/manager/0.log" Dec 05 20:44:10 crc kubenswrapper[4744]: I1205 20:44:10.107883 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:10 crc kubenswrapper[4744]: I1205 20:44:10.149630 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:13 crc kubenswrapper[4744]: I1205 20:44:13.679644 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:44:13 crc kubenswrapper[4744]: I1205 20:44:13.680269 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6xttb" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="registry-server" containerID="cri-o://fbfac2d2454063459ba4b3f3db606d7003fdf0776a431285a5cda2b14f8312db" gracePeriod=2 Dec 05 20:44:13 crc kubenswrapper[4744]: I1205 20:44:13.884686 4744 generic.go:334] "Generic (PLEG): container finished" podID="3705bc3a-9189-4ed9-937b-bdc167887481" 
containerID="fbfac2d2454063459ba4b3f3db606d7003fdf0776a431285a5cda2b14f8312db" exitCode=0 Dec 05 20:44:13 crc kubenswrapper[4744]: I1205 20:44:13.884727 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerDied","Data":"fbfac2d2454063459ba4b3f3db606d7003fdf0776a431285a5cda2b14f8312db"} Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.114179 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.300024 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content\") pod \"3705bc3a-9189-4ed9-937b-bdc167887481\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.300073 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjbz\" (UniqueName: \"kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz\") pod \"3705bc3a-9189-4ed9-937b-bdc167887481\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.300145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities\") pod \"3705bc3a-9189-4ed9-937b-bdc167887481\" (UID: \"3705bc3a-9189-4ed9-937b-bdc167887481\") " Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.301102 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities" (OuterVolumeSpecName: "utilities") pod "3705bc3a-9189-4ed9-937b-bdc167887481" (UID: "3705bc3a-9189-4ed9-937b-bdc167887481"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.306474 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz" (OuterVolumeSpecName: "kube-api-access-rbjbz") pod "3705bc3a-9189-4ed9-937b-bdc167887481" (UID: "3705bc3a-9189-4ed9-937b-bdc167887481"). InnerVolumeSpecName "kube-api-access-rbjbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.401890 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjbz\" (UniqueName: \"kubernetes.io/projected/3705bc3a-9189-4ed9-937b-bdc167887481-kube-api-access-rbjbz\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.402163 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.417177 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3705bc3a-9189-4ed9-937b-bdc167887481" (UID: "3705bc3a-9189-4ed9-937b-bdc167887481"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.504189 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3705bc3a-9189-4ed9-937b-bdc167887481-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.894839 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xttb" event={"ID":"3705bc3a-9189-4ed9-937b-bdc167887481","Type":"ContainerDied","Data":"792a2a78da6284b48a03329e4214861ad58054f2430c366a342f5b9d5c461249"} Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.894899 4744 scope.go:117] "RemoveContainer" containerID="fbfac2d2454063459ba4b3f3db606d7003fdf0776a431285a5cda2b14f8312db" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.894910 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xttb" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.919500 4744 scope.go:117] "RemoveContainer" containerID="cbb084559a815a2666aed86dce81090eb22b0479cc55648089c4855cfef7589c" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.924285 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.944812 4744 scope.go:117] "RemoveContainer" containerID="2231b0e5ad1184659a35b5735db1005c62c45a506ba55ce3ed289a6709df92ae" Dec 05 20:44:14 crc kubenswrapper[4744]: I1205 20:44:14.944937 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6xttb"] Dec 05 20:44:16 crc kubenswrapper[4744]: I1205 20:44:16.090996 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" path="/var/lib/kubelet/pods/3705bc3a-9189-4ed9-937b-bdc167887481/volumes" Dec 05 20:44:28 crc kubenswrapper[4744]: I1205 20:44:28.969881 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ql5gr_3532c9be-fdf5-43e2-b5ba-95a678fef5f8/control-plane-machine-set-operator/0.log" Dec 05 20:44:29 crc kubenswrapper[4744]: I1205 20:44:29.137060 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r5krf_818b6964-1c62-4e2e-8079-a41f9bdcb763/kube-rbac-proxy/0.log" Dec 05 20:44:29 crc kubenswrapper[4744]: I1205 20:44:29.159470 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r5krf_818b6964-1c62-4e2e-8079-a41f9bdcb763/machine-api-operator/0.log" Dec 05 20:44:37 crc kubenswrapper[4744]: I1205 20:44:37.233889 4744 scope.go:117] "RemoveContainer" containerID="d6af28c30fd182c01e93121217eb4289e991e04ea88e4ebc54e24ac262b456ec" Dec 05 20:44:37 crc kubenswrapper[4744]: I1205 20:44:37.256889 4744 scope.go:117] "RemoveContainer" containerID="61d8e1ee54fa5ae2bb38cedfe1e6cd1c7ec76f3bd9a9db2c2ee2a46e4f6eacc8" Dec 05 20:44:37 crc kubenswrapper[4744]: I1205 20:44:37.302504 4744 scope.go:117] "RemoveContainer" containerID="f7f0337b8e3263e4aac2458de206366665aea3ca88a1b92883c2f0a74b496d99" Dec 05 20:44:37 crc kubenswrapper[4744]: I1205 20:44:37.324200 4744 scope.go:117] "RemoveContainer" containerID="f39cdcde712ce7599672b1d10ffc0ef29b126b35da5654e8619e591f3bd328b5" Dec 05 20:44:43 crc kubenswrapper[4744]: I1205 20:44:43.888272 4744 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-t8449_5275d248-c1ed-4aab-a01f-1e7c65cfc66a/cert-manager-controller/0.log" Dec 05 20:44:44 crc kubenswrapper[4744]: I1205 20:44:44.041460 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-fqrqd_69280567-04f6-4557-8203-d729b6ec814e/cert-manager-cainjector/0.log" Dec 05 20:44:44 crc kubenswrapper[4744]: I1205 20:44:44.098449 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-qxrzz_e86d2f0d-d888-42bd-9781-46aecfcf2a65/cert-manager-webhook/0.log" Dec 05 20:44:57 crc kubenswrapper[4744]: I1205 20:44:57.989035 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-4vdgs_b2e18700-c2eb-4cae-8be4-4463b8a8071a/nmstate-console-plugin/0.log" Dec 05 20:44:58 crc kubenswrapper[4744]: I1205 20:44:58.173119 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fhw4s_5e13deba-1699-48ea-9085-425e98206f8d/nmstate-handler/0.log" Dec 05 20:44:58 crc kubenswrapper[4744]: I1205 20:44:58.200458 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nzkjk_6df3f631-039c-4df3-a991-9775663959e3/kube-rbac-proxy/0.log" Dec 05 20:44:58 crc kubenswrapper[4744]: I1205 20:44:58.246709 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-nzkjk_6df3f631-039c-4df3-a991-9775663959e3/nmstate-metrics/0.log" Dec 05 20:44:58 crc kubenswrapper[4744]: I1205 20:44:58.381977 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-6gznm_b9185029-82a1-4112-9539-86612a761dd9/nmstate-operator/0.log" Dec 05 20:44:58 crc kubenswrapper[4744]: I1205 20:44:58.451771 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-4jr5z_dc5309cc-3e62-4f06-94ac-c7b938ff5373/nmstate-webhook/0.log" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.142352 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr"] Dec 05 20:45:00 crc kubenswrapper[4744]: E1205 20:45:00.142916 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="registry-server" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.142931 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="registry-server" Dec 05 20:45:00 crc kubenswrapper[4744]: E1205 20:45:00.142950 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="extract-utilities" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.142958 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="extract-utilities" Dec 05 20:45:00 crc kubenswrapper[4744]: E1205 20:45:00.142986 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="extract-content" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.142994 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="extract-content" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.143146 4744 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3705bc3a-9189-4ed9-937b-bdc167887481" containerName="registry-server" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.143714 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.145718 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.147196 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.158402 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr"] Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.258409 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.258481 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.258518 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcx5\" (UniqueName: \"kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.359693 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.359763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hcx5\" (UniqueName: \"kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.359924 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc 
kubenswrapper[4744]: I1205 20:45:00.361020 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.365197 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.399002 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hcx5\" (UniqueName: \"kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5\") pod \"collect-profiles-29416125-sr4zr\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.463760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:00 crc kubenswrapper[4744]: I1205 20:45:00.932897 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr"] Dec 05 20:45:01 crc kubenswrapper[4744]: I1205 20:45:01.280691 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" event={"ID":"0feaf858-bacf-4309-9d9d-9a48f59fee88","Type":"ContainerStarted","Data":"2ff1cf6040e16c82244b29e0a5b1a2c670e1ef1aa4565929d6a3ef3c7bb83669"} Dec 05 20:45:01 crc kubenswrapper[4744]: I1205 20:45:01.280918 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" event={"ID":"0feaf858-bacf-4309-9d9d-9a48f59fee88","Type":"ContainerStarted","Data":"706c2ab778caa9a64455c0f6030605a85f145f7c2f65acc744a9d92fa560ca24"} Dec 05 20:45:01 crc kubenswrapper[4744]: I1205 20:45:01.302442 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" podStartSLOduration=1.302426128 podStartE2EDuration="1.302426128s" podCreationTimestamp="2025-12-05 20:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 20:45:01.301612538 +0000 UTC m=+2071.531423906" watchObservedRunningTime="2025-12-05 20:45:01.302426128 +0000 UTC m=+2071.532237486" Dec 05 20:45:02 crc kubenswrapper[4744]: I1205 20:45:02.290914 4744 generic.go:334] "Generic (PLEG): container finished" podID="0feaf858-bacf-4309-9d9d-9a48f59fee88" containerID="2ff1cf6040e16c82244b29e0a5b1a2c670e1ef1aa4565929d6a3ef3c7bb83669" exitCode=0 Dec 05 20:45:02 crc kubenswrapper[4744]: I1205 20:45:02.291046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" event={"ID":"0feaf858-bacf-4309-9d9d-9a48f59fee88","Type":"ContainerDied","Data":"2ff1cf6040e16c82244b29e0a5b1a2c670e1ef1aa4565929d6a3ef3c7bb83669"} Dec 05 20:45:03 crc 
kubenswrapper[4744]: I1205 20:45:03.570112 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.711008 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume\") pod \"0feaf858-bacf-4309-9d9d-9a48f59fee88\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.711756 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume" (OuterVolumeSpecName: "config-volume") pod "0feaf858-bacf-4309-9d9d-9a48f59fee88" (UID: "0feaf858-bacf-4309-9d9d-9a48f59fee88"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.711948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume\") pod \"0feaf858-bacf-4309-9d9d-9a48f59fee88\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.712050 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hcx5\" (UniqueName: \"kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5\") pod \"0feaf858-bacf-4309-9d9d-9a48f59fee88\" (UID: \"0feaf858-bacf-4309-9d9d-9a48f59fee88\") " Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.713439 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0feaf858-bacf-4309-9d9d-9a48f59fee88-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.718206 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0feaf858-bacf-4309-9d9d-9a48f59fee88" (UID: "0feaf858-bacf-4309-9d9d-9a48f59fee88"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.718236 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5" (OuterVolumeSpecName: "kube-api-access-6hcx5") pod "0feaf858-bacf-4309-9d9d-9a48f59fee88" (UID: "0feaf858-bacf-4309-9d9d-9a48f59fee88"). InnerVolumeSpecName "kube-api-access-6hcx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.814931 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0feaf858-bacf-4309-9d9d-9a48f59fee88-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:03 crc kubenswrapper[4744]: I1205 20:45:03.814969 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hcx5\" (UniqueName: \"kubernetes.io/projected/0feaf858-bacf-4309-9d9d-9a48f59fee88-kube-api-access-6hcx5\") on node \"crc\" DevicePath \"\"" Dec 05 20:45:04 crc kubenswrapper[4744]: I1205 20:45:04.309273 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" event={"ID":"0feaf858-bacf-4309-9d9d-9a48f59fee88","Type":"ContainerDied","Data":"706c2ab778caa9a64455c0f6030605a85f145f7c2f65acc744a9d92fa560ca24"} Dec 05 20:45:04 crc kubenswrapper[4744]: I1205 20:45:04.309547 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="706c2ab778caa9a64455c0f6030605a85f145f7c2f65acc744a9d92fa560ca24" Dec 05 20:45:04 crc kubenswrapper[4744]: I1205 20:45:04.309609 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416125-sr4zr" Dec 05 20:45:04 crc kubenswrapper[4744]: I1205 20:45:04.643972 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c"] Dec 05 20:45:04 crc kubenswrapper[4744]: I1205 20:45:04.650872 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416080-6dh4c"] Dec 05 20:45:06 crc kubenswrapper[4744]: I1205 20:45:06.095992 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264cec36-f420-4db9-ba83-266f78ecb82d" path="/var/lib/kubelet/pods/264cec36-f420-4db9-ba83-266f78ecb82d/volumes" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.481734 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ztkrp_a15804b0-0714-4384-ac0d-917338ef4104/kube-rbac-proxy/0.log" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.616383 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ztkrp_a15804b0-0714-4384-ac0d-917338ef4104/controller/0.log" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.757133 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-frr-files/0.log" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.900915 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-frr-files/0.log" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.960578 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-metrics/0.log" Dec 05 20:45:14 crc kubenswrapper[4744]: I1205 20:45:14.995185 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-reloader/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.003875 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-reloader/0.log" Dec 05 20:45:15 crc 
kubenswrapper[4744]: I1205 20:45:15.208837 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-frr-files/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.333188 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-metrics/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.353674 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-metrics/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.494609 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-reloader/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.672022 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-frr-files/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.678018 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-metrics/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.681364 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/controller/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.714960 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/cp-reloader/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.912950 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/kube-rbac-proxy/0.log" Dec 05 20:45:15 crc kubenswrapper[4744]: I1205 20:45:15.918521 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/frr-metrics/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.003439 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/kube-rbac-proxy-frr/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.131002 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/reloader/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.266098 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-84dtl_52a85b81-11c7-4a3f-9a1d-9ffe9edaa447/frr-k8s-webhook-server/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.378287 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-579c6fcd5-kp2mf_c15b5414-dbb3-461e-9108-26f514628d7b/manager/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.643105 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-686966fbfb-zrgkc_70f8386c-29b0-4cc8-9d75-740e8796f01a/webhook-server/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.754054 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qq6d7_934c31a1-c04b-42d9-be60-cdbf988913eb/kube-rbac-proxy/0.log" Dec 05 20:45:16 crc kubenswrapper[4744]: I1205 20:45:16.820260 
4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8gfzk_2498f6fb-e1f7-481e-af17-1138c80628ae/frr/0.log" Dec 05 20:45:17 crc kubenswrapper[4744]: I1205 20:45:17.056818 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qq6d7_934c31a1-c04b-42d9-be60-cdbf988913eb/speaker/0.log" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.448938 4744 scope.go:117] "RemoveContainer" containerID="87d306283b8f8d70c87a7ad7547dac093940d3b3e5a81c886b13f4a91e4be5c9" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.483059 4744 scope.go:117] "RemoveContainer" containerID="b979aee5296f25ac4ce2be7b962db5441e75c8e7f51bbf152627f541483ca5e2" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.530114 4744 scope.go:117] "RemoveContainer" containerID="dee98d68f5ccc437abde103b17cfd52ec3494a6186c3c32acfba284506b883b9" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.605376 4744 scope.go:117] "RemoveContainer" containerID="000acd9f79f8e63fdef9b0ea4b5f7b17fe7ffdfdb291e73f0f2b2611c1d7bb10" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.640474 4744 scope.go:117] "RemoveContainer" containerID="7e5fe38608d4f8dc724b655ae172f9a982a4cd752fc32dbffee26ec3acf3d871" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.683971 4744 scope.go:117] "RemoveContainer" containerID="24dcd57e4c27bb2b600d6f5f61855187c6a46378fcbdc25e54daf802874edad6" Dec 05 20:45:37 crc kubenswrapper[4744]: I1205 20:45:37.739775 4744 scope.go:117] "RemoveContainer" containerID="25c313de6b3a623ed38a27355064d85875128657ff6d0697cf4aebd094bb965e" Dec 05 20:45:41 crc kubenswrapper[4744]: I1205 20:45:41.840232 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_483c94a6-6fac-4036-8b54-d22abbf49164/init-config-reloader/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.077633 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_483c94a6-6fac-4036-8b54-d22abbf49164/init-config-reloader/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.082509 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_483c94a6-6fac-4036-8b54-d22abbf49164/alertmanager/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.123245 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_483c94a6-6fac-4036-8b54-d22abbf49164/config-reloader/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.311465 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_ab9ab744-e215-4bca-a2d4-53969e490cdb/ceilometer-notification-agent/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.334854 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_ab9ab744-e215-4bca-a2d4-53969e490cdb/proxy-httpd/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.349458 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_ab9ab744-e215-4bca-a2d4-53969e490cdb/ceilometer-central-agent/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.361658 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_ab9ab744-e215-4bca-a2d4-53969e490cdb/sg-core/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.545286 4744 log.go:25] "Finished 
parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-bootstrap-8ktjk_b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d/keystone-bootstrap/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.597545 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-544f89f8d4-qfdqn_30b134cc-5016-44bf-9d8a-b26e23b38782/keystone-api/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.728500 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_1b1577b0-93ce-41d3-9c87-6009a42d525a/kube-state-metrics/0.log" Dec 05 20:45:42 crc kubenswrapper[4744]: I1205 20:45:42.985130 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_a8aefca6-22e2-4e40-9287-3e0fec292264/mysql-bootstrap/0.log" Dec 05 20:45:43 crc kubenswrapper[4744]: I1205 20:45:43.217455 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_a8aefca6-22e2-4e40-9287-3e0fec292264/mysql-bootstrap/0.log" Dec 05 20:45:43 crc kubenswrapper[4744]: I1205 20:45:43.348574 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_a8aefca6-22e2-4e40-9287-3e0fec292264/galera/0.log" Dec 05 20:45:43 crc kubenswrapper[4744]: I1205 20:45:43.782271 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_b9978ca7-d572-45a0-8b3a-94a3eef5e1b2/openstackclient/0.log" Dec 05 20:45:43 crc kubenswrapper[4744]: I1205 20:45:43.843510 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_be427b22-e361-4be4-8eec-bb2b4be47296/init-config-reloader/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.155062 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_be427b22-e361-4be4-8eec-bb2b4be47296/prometheus/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.174458 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_be427b22-e361-4be4-8eec-bb2b4be47296/init-config-reloader/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.259545 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_be427b22-e361-4be4-8eec-bb2b4be47296/config-reloader/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.396240 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_be427b22-e361-4be4-8eec-bb2b4be47296/thanos-sidecar/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.554969 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_c4dae229-7a1c-4eb8-8932-7fd75e348bb2/setup-container/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.812322 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_c4dae229-7a1c-4eb8-8932-7fd75e348bb2/rabbitmq/0.log" Dec 05 20:45:44 crc kubenswrapper[4744]: I1205 20:45:44.838951 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_c4dae229-7a1c-4eb8-8932-7fd75e348bb2/setup-container/0.log" Dec 05 20:45:45 crc kubenswrapper[4744]: I1205 20:45:45.140400 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_cfb456f7-66c1-4493-85d4-bae3322914f9/setup-container/0.log" Dec 05 20:45:45 crc kubenswrapper[4744]: I1205 20:45:45.269404 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_cfb456f7-66c1-4493-85d4-bae3322914f9/setup-container/0.log" Dec 05 20:45:45 crc kubenswrapper[4744]: I1205 20:45:45.336348 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_cfb456f7-66c1-4493-85d4-bae3322914f9/rabbitmq/0.log" Dec 05 20:45:51 crc kubenswrapper[4744]: I1205 20:45:51.158358 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_21a4ed1e-1e04-482c-a036-dc690da56572/memcached/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.379463 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/util/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.583464 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/util/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.635764 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/pull/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.665008 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/pull/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.777203 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/util/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.821460 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/extract/0.log" Dec 05 20:46:03 crc kubenswrapper[4744]: I1205 20:46:03.849777 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931antzfd_c822e5a4-a983-475b-95f4-0557534a89b6/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.000450 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/util/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.156020 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/util/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.180345 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.203502 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.398320 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.404897 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/extract/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.425879 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f74vs5_a23e281a-f6ab-488a-97f1-e8854dedc3c3/util/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.593994 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/util/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.826097 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.830417 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/pull/0.log" Dec 05 20:46:04 crc kubenswrapper[4744]: I1205 20:46:04.884596 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/util/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.026652 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/util/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.069688 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/extract/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.078248 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mqbpk_64773703-6ddb-4194-b745-6d130565fe68/pull/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.209070 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/util/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.410963 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/pull/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.443637 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/pull/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.466451 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/util/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.621961 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/util/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.648067 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/pull/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.696936 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f8362vqv_9c2bb82a-ab47-491c-8379-7204ae825090/extract/0.log" Dec 05 20:46:05 crc kubenswrapper[4744]: I1205 20:46:05.852774 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.080373 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-content/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.101900 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.123002 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-content/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.291935 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.340134 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/extract-content/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.499566 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5vtxf_9d2d3bdb-3fb4-4934-a6a6-5943e734a347/registry-server/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.502782 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.734848 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-content/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.744799 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-content/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.753258 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.877969 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-utilities/0.log" Dec 05 20:46:06 crc kubenswrapper[4744]: I1205 20:46:06.938018 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/extract-content/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.352192 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nqwbs_fd4f5e2f-4f29-4d8a-ab50-a9fd969fe293/registry-server/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.383663 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p727x_f1fd1d53-3fde-4ef7-be02-f689ce95885b/marketplace-operator/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.425761 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-utilities/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.599114 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-utilities/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.602219 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-content/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.649336 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-content/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.847583 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-content/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.927770 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/extract-utilities/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.952332 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8x5wd_f7078c58-7b34-4700-83ab-2b104f662fff/registry-server/0.log" Dec 05 20:46:07 crc kubenswrapper[4744]: I1205 20:46:07.977144 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-utilities/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.172821 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-content/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.174991 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-utilities/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.198829 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-content/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.379703 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-content/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.391103 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/extract-utilities/0.log" Dec 05 20:46:08 crc kubenswrapper[4744]: I1205 20:46:08.811774 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mx8q8_b7611a0a-e6bd-4051-bf2d-b3c28e86d91b/registry-server/0.log" Dec 05 20:46:19 crc kubenswrapper[4744]: I1205 20:46:19.806506 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:46:19 crc kubenswrapper[4744]: I1205 20:46:19.807051 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:46:22 crc kubenswrapper[4744]: I1205 20:46:22.991501 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-6vnbf_d6cb3e32-cc6f-4091-ae30-5de5790d952c/prometheus-operator/0.log" Dec 05 20:46:23 crc kubenswrapper[4744]: I1205 20:46:23.123064 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58cc69598f-6rn9n_a755ef85-a445-4cd1-bb7b-4bea0bb7b796/prometheus-operator-admission-webhook/0.log" Dec 05 20:46:23 crc kubenswrapper[4744]: I1205 20:46:23.183284 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58cc69598f-k8vk8_73c8ea4a-800d-4d94-9732-f81484c43481/prometheus-operator-admission-webhook/0.log" Dec 05 20:46:23 crc kubenswrapper[4744]: I1205 20:46:23.342639 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-nlfb9_f152504a-5f82-434d-904e-b9e1f2e49a5e/operator/0.log" Dec 05 20:46:23 crc kubenswrapper[4744]: I1205 20:46:23.466354 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-pn2kn_498942c9-c035-4d3f-a38b-a05221dc46c3/observability-ui-dashboards/0.log" Dec 05 20:46:23 crc kubenswrapper[4744]: I1205 20:46:23.621691 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-52fcv_db4f5a8a-a57f-4988-8b36-f8926084fce9/perses-operator/0.log" Dec 05 20:46:37 crc kubenswrapper[4744]: I1205 20:46:37.894633 4744 scope.go:117] "RemoveContainer" containerID="1467df7c76cb2807243c1d1ac62c74a3774d7572933e4f11c1fe28e63cf17bc0" 
Dec 05 20:46:37 crc kubenswrapper[4744]: I1205 20:46:37.917284 4744 scope.go:117] "RemoveContainer" containerID="5d73214a3642252484df6ce52dacebae3c72ebc72fd64b160d8b1c3e2ce84296" Dec 05 20:46:37 crc kubenswrapper[4744]: I1205 20:46:37.933813 4744 scope.go:117] "RemoveContainer" containerID="d751b18d85435dbf5e8cd72e0534fdc77f901a8d0abbc55f803a111f4594df75" Dec 05 20:46:37 crc kubenswrapper[4744]: I1205 20:46:37.970899 4744 scope.go:117] "RemoveContainer" containerID="246b91873d98658315f4cb599ab83bfe269c08f82fa1ccb960a632a65e2f8161" Dec 05 20:46:38 crc kubenswrapper[4744]: I1205 20:46:38.023123 4744 scope.go:117] "RemoveContainer" containerID="01c01d346ca2c807d91471dd8ce7d99511b3eda18615b3868f36f8866a5d34e6" Dec 05 20:46:38 crc kubenswrapper[4744]: I1205 20:46:38.046002 4744 scope.go:117] "RemoveContainer" containerID="f46954bf1567479f870d7f703ec1b67f2fc610ccfbc2abd51cd18993eb1f56e1" Dec 05 20:46:39 crc kubenswrapper[4744]: I1205 20:46:39.058688 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8ktjk"] Dec 05 20:46:39 crc kubenswrapper[4744]: I1205 20:46:39.069269 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-8ktjk"] Dec 05 20:46:40 crc kubenswrapper[4744]: I1205 20:46:40.098569 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d" path="/var/lib/kubelet/pods/b0d4e75d-d5d0-4f20-b91c-d5eef1acfe9d/volumes" Dec 05 20:46:49 crc kubenswrapper[4744]: I1205 20:46:49.806454 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:46:49 crc kubenswrapper[4744]: I1205 20:46:49.807017 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.694251 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:14 crc kubenswrapper[4744]: E1205 20:47:14.695129 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0feaf858-bacf-4309-9d9d-9a48f59fee88" containerName="collect-profiles" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.695143 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0feaf858-bacf-4309-9d9d-9a48f59fee88" containerName="collect-profiles" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.695373 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0feaf858-bacf-4309-9d9d-9a48f59fee88" containerName="collect-profiles" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.696733 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.722338 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.742354 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.742536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.742571 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.844319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.844369 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.844405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.844774 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.844817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:14 crc kubenswrapper[4744]: I1205 20:47:14.867934 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6\") pod \"community-operators-nwcmc\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:15 crc kubenswrapper[4744]: I1205 20:47:15.019019 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:15 crc kubenswrapper[4744]: I1205 20:47:15.531893 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:15 crc kubenswrapper[4744]: W1205 20:47:15.535120 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod455d05e8_a043_4437_9b7b_74c09ab84970.slice/crio-c2112deadaf71abbcf453c61d7b8d24dced1b5d735179155b3519b6c510bca98 WatchSource:0}: Error finding container c2112deadaf71abbcf453c61d7b8d24dced1b5d735179155b3519b6c510bca98: Status 404 returned error can't find the container with id c2112deadaf71abbcf453c61d7b8d24dced1b5d735179155b3519b6c510bca98 Dec 05 20:47:16 crc kubenswrapper[4744]: I1205 20:47:16.501169 4744 generic.go:334] "Generic (PLEG): container finished" podID="455d05e8-a043-4437-9b7b-74c09ab84970" containerID="2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039" exitCode=0 Dec 05 20:47:16 crc kubenswrapper[4744]: I1205 20:47:16.501254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerDied","Data":"2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039"} Dec 05 20:47:16 crc kubenswrapper[4744]: I1205 20:47:16.501320 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerStarted","Data":"c2112deadaf71abbcf453c61d7b8d24dced1b5d735179155b3519b6c510bca98"} Dec 05 20:47:17 crc kubenswrapper[4744]: I1205 20:47:17.520365 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerStarted","Data":"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc"} Dec 05 20:47:18 crc kubenswrapper[4744]: I1205 20:47:18.532230 4744 generic.go:334] "Generic (PLEG): container finished" podID="455d05e8-a043-4437-9b7b-74c09ab84970" containerID="1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc" exitCode=0 Dec 05 20:47:18 crc kubenswrapper[4744]: I1205 20:47:18.532309 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerDied","Data":"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc"} Dec 05 20:47:18 crc kubenswrapper[4744]: I1205 20:47:18.532594 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerStarted","Data":"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a"} Dec 05 20:47:18 crc kubenswrapper[4744]: I1205 20:47:18.553860 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nwcmc" 
podStartSLOduration=3.140321953 podStartE2EDuration="4.553844002s" podCreationTimestamp="2025-12-05 20:47:14 +0000 UTC" firstStartedPulling="2025-12-05 20:47:16.503435115 +0000 UTC m=+2206.733246493" lastFinishedPulling="2025-12-05 20:47:17.916957164 +0000 UTC m=+2208.146768542" observedRunningTime="2025-12-05 20:47:18.549435845 +0000 UTC m=+2208.779247213" watchObservedRunningTime="2025-12-05 20:47:18.553844002 +0000 UTC m=+2208.783655370" Dec 05 20:47:19 crc kubenswrapper[4744]: I1205 20:47:19.806548 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:47:19 crc kubenswrapper[4744]: I1205 20:47:19.806979 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:47:19 crc kubenswrapper[4744]: I1205 20:47:19.807028 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" Dec 05 20:47:19 crc kubenswrapper[4744]: I1205 20:47:19.807752 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 20:47:19 crc kubenswrapper[4744]: I1205 20:47:19.807810 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078" gracePeriod=600 Dec 05 20:47:20 crc kubenswrapper[4744]: I1205 20:47:20.552167 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078" exitCode=0 Dec 05 20:47:20 crc kubenswrapper[4744]: I1205 20:47:20.552247 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078"} Dec 05 20:47:20 crc kubenswrapper[4744]: I1205 20:47:20.552446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerStarted","Data":"e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6"} Dec 05 20:47:20 crc kubenswrapper[4744]: I1205 20:47:20.552468 4744 scope.go:117] "RemoveContainer" containerID="0feddadbb34d0bb15f6e44e77b1bc66172f02b344eb60c4a1cefa79789357921" Dec 05 20:47:24 crc kubenswrapper[4744]: I1205 20:47:24.601891 4744 generic.go:334] "Generic (PLEG): container finished" podID="7631d304-47c0-4814-825c-c1c7297c585c" 
containerID="f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea" exitCode=0 Dec 05 20:47:24 crc kubenswrapper[4744]: I1205 20:47:24.601936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fb5tg/must-gather-xft2c" event={"ID":"7631d304-47c0-4814-825c-c1c7297c585c","Type":"ContainerDied","Data":"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea"} Dec 05 20:47:24 crc kubenswrapper[4744]: I1205 20:47:24.603140 4744 scope.go:117] "RemoveContainer" containerID="f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea" Dec 05 20:47:25 crc kubenswrapper[4744]: I1205 20:47:25.020154 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:25 crc kubenswrapper[4744]: I1205 20:47:25.020223 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:25 crc kubenswrapper[4744]: I1205 20:47:25.106996 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:25 crc kubenswrapper[4744]: I1205 20:47:25.285811 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fb5tg_must-gather-xft2c_7631d304-47c0-4814-825c-c1c7297c585c/gather/0.log" Dec 05 20:47:25 crc kubenswrapper[4744]: I1205 20:47:25.659072 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:28 crc kubenswrapper[4744]: I1205 20:47:28.688929 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:28 crc kubenswrapper[4744]: I1205 20:47:28.689839 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nwcmc" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="registry-server" containerID="cri-o://01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a" gracePeriod=2 Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.206502 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.275087 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6\") pod \"455d05e8-a043-4437-9b7b-74c09ab84970\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.275139 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities\") pod \"455d05e8-a043-4437-9b7b-74c09ab84970\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.275227 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content\") pod \"455d05e8-a043-4437-9b7b-74c09ab84970\" (UID: \"455d05e8-a043-4437-9b7b-74c09ab84970\") " Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.276862 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities" (OuterVolumeSpecName: "utilities") pod "455d05e8-a043-4437-9b7b-74c09ab84970" (UID: "455d05e8-a043-4437-9b7b-74c09ab84970"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.279601 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6" (OuterVolumeSpecName: "kube-api-access-66rh6") pod "455d05e8-a043-4437-9b7b-74c09ab84970" (UID: "455d05e8-a043-4437-9b7b-74c09ab84970"). InnerVolumeSpecName "kube-api-access-66rh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.328617 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "455d05e8-a043-4437-9b7b-74c09ab84970" (UID: "455d05e8-a043-4437-9b7b-74c09ab84970"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.376407 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.376456 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/455d05e8-a043-4437-9b7b-74c09ab84970-kube-api-access-66rh6\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.376471 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455d05e8-a043-4437-9b7b-74c09ab84970-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.667847 4744 generic.go:334] "Generic (PLEG): container finished" podID="455d05e8-a043-4437-9b7b-74c09ab84970" containerID="01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a" exitCode=0 Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.667910 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerDied","Data":"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a"} Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.667942 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwcmc" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.667977 4744 scope.go:117] "RemoveContainer" containerID="01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.667960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwcmc" event={"ID":"455d05e8-a043-4437-9b7b-74c09ab84970","Type":"ContainerDied","Data":"c2112deadaf71abbcf453c61d7b8d24dced1b5d735179155b3519b6c510bca98"} Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.691616 4744 scope.go:117] "RemoveContainer" containerID="1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.713923 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.721089 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nwcmc"] Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.730776 4744 scope.go:117] "RemoveContainer" containerID="2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.767532 4744 scope.go:117] "RemoveContainer" containerID="01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a" Dec 05 20:47:29 crc kubenswrapper[4744]: E1205 20:47:29.767958 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a\": container with ID starting with 01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a not found: ID does not exist" containerID="01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.767992 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a"} err="failed to get container status \"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a\": rpc error: code = NotFound desc = could not find container \"01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a\": container with ID starting with 01f1b7bd276a6036ce04e712e8575da410b5d1897aa9c216ce6ebb434d8bd36a not found: ID does not exist" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.768028 4744 scope.go:117] "RemoveContainer" containerID="1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc" Dec 05 20:47:29 crc kubenswrapper[4744]: E1205 20:47:29.768343 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc\": container with ID starting with 1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc not found: ID does not exist" containerID="1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.768393 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc"} err="failed to get container status \"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc\": rpc error: code = NotFound desc = could not find container \"1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc\": container with ID starting with 1937db651a4b1775849ea4c9fd75112d31da383b45ccdd22bb6dee3b904ea0fc not found: ID does not exist" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.768428 4744 scope.go:117] "RemoveContainer" containerID="2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039" Dec 05 20:47:29 crc kubenswrapper[4744]: E1205 20:47:29.768843 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039\": container with ID starting with 2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039 not found: ID does not exist" containerID="2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039" Dec 05 20:47:29 crc kubenswrapper[4744]: I1205 20:47:29.768870 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039"} err="failed to get container status \"2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039\": rpc error: code = NotFound desc = could not find container \"2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039\": container with ID starting with 2d3cb0ed04250d42366ed6e1d4fe70e74b74317ea9847f3fdb1638739adc8039 not found: ID does not exist" Dec 05 20:47:30 crc kubenswrapper[4744]: I1205 20:47:30.092695 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" path="/var/lib/kubelet/pods/455d05e8-a043-4437-9b7b-74c09ab84970/volumes" Dec 05 20:47:32 crc kubenswrapper[4744]: I1205 20:47:32.732846 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fb5tg/must-gather-xft2c"] Dec 05 20:47:32 crc kubenswrapper[4744]: I1205 20:47:32.733563 4744 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-fb5tg/must-gather-xft2c" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="copy" containerID="cri-o://aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a" gracePeriod=2 Dec 05 20:47:32 crc kubenswrapper[4744]: I1205 20:47:32.743787 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fb5tg/must-gather-xft2c"] Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.080578 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fb5tg_must-gather-xft2c_7631d304-47c0-4814-825c-c1c7297c585c/copy/0.log" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.081202 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.234667 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output\") pod \"7631d304-47c0-4814-825c-c1c7297c585c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.234816 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whkxg\" (UniqueName: \"kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg\") pod \"7631d304-47c0-4814-825c-c1c7297c585c\" (UID: \"7631d304-47c0-4814-825c-c1c7297c585c\") " Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.244049 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg" (OuterVolumeSpecName: "kube-api-access-whkxg") pod "7631d304-47c0-4814-825c-c1c7297c585c" (UID: "7631d304-47c0-4814-825c-c1c7297c585c"). InnerVolumeSpecName "kube-api-access-whkxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.332162 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7631d304-47c0-4814-825c-c1c7297c585c" (UID: "7631d304-47c0-4814-825c-c1c7297c585c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.336835 4744 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7631d304-47c0-4814-825c-c1c7297c585c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.336860 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whkxg\" (UniqueName: \"kubernetes.io/projected/7631d304-47c0-4814-825c-c1c7297c585c-kube-api-access-whkxg\") on node \"crc\" DevicePath \"\"" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.702240 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fb5tg_must-gather-xft2c_7631d304-47c0-4814-825c-c1c7297c585c/copy/0.log" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.702906 4744 generic.go:334] "Generic (PLEG): container finished" podID="7631d304-47c0-4814-825c-c1c7297c585c" containerID="aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a" exitCode=143 Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.702957 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fb5tg/must-gather-xft2c" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.702964 4744 scope.go:117] "RemoveContainer" containerID="aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.733797 4744 scope.go:117] "RemoveContainer" containerID="f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.794392 4744 scope.go:117] "RemoveContainer" containerID="aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a" Dec 05 20:47:33 crc kubenswrapper[4744]: E1205 20:47:33.794975 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a\": container with ID starting with aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a not found: ID does not exist" containerID="aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.795024 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a"} err="failed to get container status \"aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a\": rpc error: code = NotFound desc = could not find container \"aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a\": container with ID starting with aa846b5ebe0a154a1376e841b7539b114804221c20f27495888783f01650e25a not found: ID does not exist" Dec 05 20:47:33 crc kubenswrapper[4744]: I1205 20:47:33.795050 4744 scope.go:117] "RemoveContainer" containerID="f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea" Dec 05 20:47:33 crc kubenswrapper[4744]: E1205 20:47:33.795371 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea\": container with ID starting with f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea not found: ID does not exist" containerID="f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea" Dec 05 20:47:33 crc 
kubenswrapper[4744]: I1205 20:47:33.795416 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea"} err="failed to get container status \"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea\": rpc error: code = NotFound desc = could not find container \"f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea\": container with ID starting with f64d1209699ec16e3a68ac3ba712b6631cf52a29015e940a6c2007f26702a8ea not found: ID does not exist" Dec 05 20:47:34 crc kubenswrapper[4744]: I1205 20:47:34.092065 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7631d304-47c0-4814-825c-c1c7297c585c" path="/var/lib/kubelet/pods/7631d304-47c0-4814-825c-c1c7297c585c/volumes" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.138825 4744 scope.go:117] "RemoveContainer" containerID="d313866aed818743e6b1dcb19321a72744910236a4e9f134b3bf8b2ccbbe0b84" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.164724 4744 scope.go:117] "RemoveContainer" containerID="fce23b8b5d7310add4cafdcad32bf1ef3f1d089c67eec3779049ace5f03cf9b1" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.186850 4744 scope.go:117] "RemoveContainer" containerID="ebcbc3dfa8120f968315f6fc6d5a54fef03796bec1c57fda67ae748e42f867fd" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.205944 4744 scope.go:117] "RemoveContainer" containerID="bf20b69d955132a45cc905c13bf5975c2a90b83b437706c3331f0e6940247a7e" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.245614 4744 scope.go:117] "RemoveContainer" containerID="50c232af572cf057632a973e1b339c0dc214c5f0f2c0ea3f1145c19d187b9198" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.279270 4744 scope.go:117] "RemoveContainer" containerID="76f4447081f9e62d788f17c32db1b7f2c28204661cd660c045adbc35ffaaf9b0" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.299343 4744 scope.go:117] "RemoveContainer" containerID="6e3a2a4acc6ff94bd082aaa2663dd3c2ae1b06ef6291b53db18ca789c6d7b93d" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.331818 4744 scope.go:117] "RemoveContainer" containerID="c705812d17e73610015cb6aeab2284831a6fb05e3b6b382daae8950465eb9fb3" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.360222 4744 scope.go:117] "RemoveContainer" containerID="5829861bcdbcca036d9431fd67071926377ab10428bbb9a0386e222665934602" Dec 05 20:47:38 crc kubenswrapper[4744]: I1205 20:47:38.393712 4744 scope.go:117] "RemoveContainer" containerID="e7d5c6d41c77fe170228c097f4eca79210d72cb3a30957655e3b0f11cd2ccfd5" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.495176 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:01 crc kubenswrapper[4744]: E1205 20:48:01.496072 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="copy" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496087 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="copy" Dec 05 20:48:01 crc kubenswrapper[4744]: E1205 20:48:01.496100 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="extract-content" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496107 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="extract-content" 
Dec 05 20:48:01 crc kubenswrapper[4744]: E1205 20:48:01.496118 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="gather" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496125 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="gather" Dec 05 20:48:01 crc kubenswrapper[4744]: E1205 20:48:01.496151 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="registry-server" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496159 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="registry-server" Dec 05 20:48:01 crc kubenswrapper[4744]: E1205 20:48:01.496183 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="extract-utilities" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496190 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="extract-utilities" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496375 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="copy" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496399 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="455d05e8-a043-4437-9b7b-74c09ab84970" containerName="registry-server" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.496412 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7631d304-47c0-4814-825c-c1c7297c585c" containerName="gather" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.497829 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.512807 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.668137 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.668353 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwtwg\" (UniqueName: \"kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.668408 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.770047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.770194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwtwg\" (UniqueName: \"kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.770221 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.770657 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.770691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.791319 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dwtwg\" (UniqueName: \"kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg\") pod \"redhat-marketplace-5j849\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:01 crc kubenswrapper[4744]: I1205 20:48:01.813352 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:02 crc kubenswrapper[4744]: I1205 20:48:02.281047 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:03 crc kubenswrapper[4744]: I1205 20:48:03.012441 4744 generic.go:334] "Generic (PLEG): container finished" podID="49904e21-be42-4853-b1ec-1dba20b149f2" containerID="af8cdabb5fa558e7c1db9dc1752c804c2357590487ad892440c7adac1c235274" exitCode=0 Dec 05 20:48:03 crc kubenswrapper[4744]: I1205 20:48:03.012491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerDied","Data":"af8cdabb5fa558e7c1db9dc1752c804c2357590487ad892440c7adac1c235274"} Dec 05 20:48:03 crc kubenswrapper[4744]: I1205 20:48:03.012775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerStarted","Data":"2ed7762e116419838b79c96bf23abcefd08792cba02cd843134be897b815b0d2"} Dec 05 20:48:04 crc kubenswrapper[4744]: I1205 20:48:04.024836 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerStarted","Data":"53ca09b91450702626f5d0c1ec8e372d371e8dbe830b90ff630a8e25e2c15191"} Dec 05 20:48:05 crc kubenswrapper[4744]: I1205 20:48:05.034749 4744 generic.go:334] "Generic (PLEG): container finished" podID="49904e21-be42-4853-b1ec-1dba20b149f2" containerID="53ca09b91450702626f5d0c1ec8e372d371e8dbe830b90ff630a8e25e2c15191" exitCode=0 Dec 05 20:48:05 crc kubenswrapper[4744]: I1205 20:48:05.034796 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerDied","Data":"53ca09b91450702626f5d0c1ec8e372d371e8dbe830b90ff630a8e25e2c15191"} Dec 05 20:48:06 crc kubenswrapper[4744]: I1205 20:48:06.045233 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerStarted","Data":"1171fc8503ecc66dede8d8aa3319dde1a57162ea1cff555f8cfc426a14fcba45"} Dec 05 20:48:06 crc kubenswrapper[4744]: I1205 20:48:06.076161 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5j849" podStartSLOduration=2.642075534 podStartE2EDuration="5.076142643s" podCreationTimestamp="2025-12-05 20:48:01 +0000 UTC" firstStartedPulling="2025-12-05 20:48:03.014854553 +0000 UTC m=+2253.244665921" lastFinishedPulling="2025-12-05 20:48:05.448921662 +0000 UTC m=+2255.678733030" observedRunningTime="2025-12-05 20:48:06.068009635 +0000 UTC m=+2256.297821003" watchObservedRunningTime="2025-12-05 20:48:06.076142643 +0000 UTC m=+2256.305954011" Dec 05 20:48:11 crc kubenswrapper[4744]: I1205 20:48:11.814466 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:11 crc kubenswrapper[4744]: I1205 20:48:11.814853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:11 crc kubenswrapper[4744]: I1205 20:48:11.876886 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:12 crc kubenswrapper[4744]: I1205 20:48:12.162810 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:15 crc kubenswrapper[4744]: I1205 20:48:15.482646 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:15 crc kubenswrapper[4744]: I1205 20:48:15.483184 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5j849" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="registry-server" containerID="cri-o://1171fc8503ecc66dede8d8aa3319dde1a57162ea1cff555f8cfc426a14fcba45" gracePeriod=2 Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.133455 4744 generic.go:334] "Generic (PLEG): container finished" podID="49904e21-be42-4853-b1ec-1dba20b149f2" containerID="1171fc8503ecc66dede8d8aa3319dde1a57162ea1cff555f8cfc426a14fcba45" exitCode=0 Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.133505 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerDied","Data":"1171fc8503ecc66dede8d8aa3319dde1a57162ea1cff555f8cfc426a14fcba45"} Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.479718 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.628022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwtwg\" (UniqueName: \"kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg\") pod \"49904e21-be42-4853-b1ec-1dba20b149f2\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.628120 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities\") pod \"49904e21-be42-4853-b1ec-1dba20b149f2\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.628388 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content\") pod \"49904e21-be42-4853-b1ec-1dba20b149f2\" (UID: \"49904e21-be42-4853-b1ec-1dba20b149f2\") " Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.629698 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities" (OuterVolumeSpecName: "utilities") pod "49904e21-be42-4853-b1ec-1dba20b149f2" (UID: "49904e21-be42-4853-b1ec-1dba20b149f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.638580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg" (OuterVolumeSpecName: "kube-api-access-dwtwg") pod "49904e21-be42-4853-b1ec-1dba20b149f2" (UID: "49904e21-be42-4853-b1ec-1dba20b149f2"). InnerVolumeSpecName "kube-api-access-dwtwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.670263 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49904e21-be42-4853-b1ec-1dba20b149f2" (UID: "49904e21-be42-4853-b1ec-1dba20b149f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.730368 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwtwg\" (UniqueName: \"kubernetes.io/projected/49904e21-be42-4853-b1ec-1dba20b149f2-kube-api-access-dwtwg\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.730409 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:16 crc kubenswrapper[4744]: I1205 20:48:16.730423 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49904e21-be42-4853-b1ec-1dba20b149f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.146617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j849" event={"ID":"49904e21-be42-4853-b1ec-1dba20b149f2","Type":"ContainerDied","Data":"2ed7762e116419838b79c96bf23abcefd08792cba02cd843134be897b815b0d2"} Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.146670 4744 scope.go:117] "RemoveContainer" containerID="1171fc8503ecc66dede8d8aa3319dde1a57162ea1cff555f8cfc426a14fcba45" Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.146691 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j849" Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.164838 4744 scope.go:117] "RemoveContainer" containerID="53ca09b91450702626f5d0c1ec8e372d371e8dbe830b90ff630a8e25e2c15191" Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.185351 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.192038 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j849"] Dec 05 20:48:17 crc kubenswrapper[4744]: I1205 20:48:17.218195 4744 scope.go:117] "RemoveContainer" containerID="af8cdabb5fa558e7c1db9dc1752c804c2357590487ad892440c7adac1c235274" Dec 05 20:48:18 crc kubenswrapper[4744]: I1205 20:48:18.099156 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" path="/var/lib/kubelet/pods/49904e21-be42-4853-b1ec-1dba20b149f2/volumes" Dec 05 20:48:38 crc kubenswrapper[4744]: I1205 20:48:38.539382 4744 scope.go:117] "RemoveContainer" containerID="d3fd0d82f53b413bbfdc2775eb3d6c85012690f65e6bc189f19d4f34073feebf" Dec 05 20:48:38 crc kubenswrapper[4744]: I1205 20:48:38.565902 4744 scope.go:117] "RemoveContainer" containerID="7bc978c9bd198c9a2b39b98805aad2f918174954b5fbbd71fff48bb3d4491596" Dec 05 20:48:38 crc kubenswrapper[4744]: I1205 20:48:38.610763 4744 scope.go:117] "RemoveContainer" containerID="128094aea8fa20b96b19fe855c1d1f1c226683188f571a036f4c0baf84ecee45" Dec 05 20:48:38 crc kubenswrapper[4744]: I1205 20:48:38.627756 4744 scope.go:117] "RemoveContainer" containerID="29e9e868e49fe0e6de7e312d86270b80a88b7c3bc533f798955d0ede1d0f5a82" Dec 05 20:48:38 crc kubenswrapper[4744]: I1205 20:48:38.675649 4744 scope.go:117] "RemoveContainer" containerID="ca968dd8c18f20e765a8e2b95db30864932ab1e30c4b8227429b63a9ad992989" Dec 05 20:49:49 crc kubenswrapper[4744]: I1205 20:49:49.807197 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:49:49 crc kubenswrapper[4744]: I1205 20:49:49.808483 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:50:19 crc kubenswrapper[4744]: I1205 20:50:19.806950 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 20:50:19 crc kubenswrapper[4744]: I1205 20:50:19.807930 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.807097 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd 
Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.807097 4744 patch_prober.go:28] interesting pod/machine-config-daemon-bkhvd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.807589 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.807638 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd"
Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.808245 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6"} pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 20:50:49 crc kubenswrapper[4744]: I1205 20:50:49.808319 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerName="machine-config-daemon" containerID="cri-o://e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6" gracePeriod=600
Dec 05 20:50:49 crc kubenswrapper[4744]: E1205 20:50:49.941372 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"
Dec 05 20:50:50 crc kubenswrapper[4744]: I1205 20:50:50.641463 4744 generic.go:334] "Generic (PLEG): container finished" podID="e25986a8-4343-4c98-bc53-6c1b077661f9" containerID="e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6" exitCode=0
Dec 05 20:50:50 crc kubenswrapper[4744]: I1205 20:50:50.641786 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" event={"ID":"e25986a8-4343-4c98-bc53-6c1b077661f9","Type":"ContainerDied","Data":"e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6"}
Dec 05 20:50:50 crc kubenswrapper[4744]: I1205 20:50:50.641822 4744 scope.go:117] "RemoveContainer" containerID="868cd4bd451bd54b5a700cd8156999c8957e86cb450992ed62513f6e758cc078"
Dec 05 20:50:50 crc kubenswrapper[4744]: I1205 20:50:50.642673 4744 scope.go:117] "RemoveContainer" containerID="e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6"
podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.091258 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"] Dec 05 20:50:53 crc kubenswrapper[4744]: E1205 20:50:53.091951 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="extract-utilities" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.091962 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="extract-utilities" Dec 05 20:50:53 crc kubenswrapper[4744]: E1205 20:50:53.091995 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="registry-server" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.092001 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="registry-server" Dec 05 20:50:53 crc kubenswrapper[4744]: E1205 20:50:53.092012 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="extract-content" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.092018 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="extract-content" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.092208 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="49904e21-be42-4853-b1ec-1dba20b149f2" containerName="registry-server" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.093255 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.111510 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"] Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.263853 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-utilities\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.264088 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c86f\" (UniqueName: \"kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.264263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.365602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c86f\" (UniqueName: \"kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f\") pod \"certified-operators-lpdwr\" (UID: 
\"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.365699 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.365766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-utilities\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.366223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.366340 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-utilities\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.386519 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c86f\" (UniqueName: \"kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f\") pod \"certified-operators-lpdwr\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") " pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.411046 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:50:53 crc kubenswrapper[4744]: I1205 20:50:53.995081 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"] Dec 05 20:50:54 crc kubenswrapper[4744]: I1205 20:50:54.679805 4744 generic.go:334] "Generic (PLEG): container finished" podID="3e507059-c112-4811-9ccd-793a5dab26f9" containerID="963c2786d113fbb6f680689f610b0e0c2f9c8ed66a4759bb5c04b0058707c032" exitCode=0 Dec 05 20:50:54 crc kubenswrapper[4744]: I1205 20:50:54.679902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerDied","Data":"963c2786d113fbb6f680689f610b0e0c2f9c8ed66a4759bb5c04b0058707c032"} Dec 05 20:50:54 crc kubenswrapper[4744]: I1205 20:50:54.680086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerStarted","Data":"9194b602c1c83c3347fe62b1e61a38889847da4d78912e036c827b16acda3ca4"} Dec 05 20:50:54 crc kubenswrapper[4744]: I1205 20:50:54.683921 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 20:50:55 crc kubenswrapper[4744]: I1205 20:50:55.690335 4744 generic.go:334] "Generic (PLEG): container finished" podID="3e507059-c112-4811-9ccd-793a5dab26f9" containerID="e7b920ea7e4056d83403bb6912c0b4bcfc2078f6b7f52f2ec0fd2a81f3063a14" exitCode=0 Dec 05 20:50:55 crc kubenswrapper[4744]: I1205 20:50:55.690405 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerDied","Data":"e7b920ea7e4056d83403bb6912c0b4bcfc2078f6b7f52f2ec0fd2a81f3063a14"} Dec 05 20:50:56 crc kubenswrapper[4744]: I1205 20:50:56.703613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerStarted","Data":"fe4fc45bc54001bc3136347aba0a837478bbe1280d5530bac0feaa80e774bf4d"} Dec 05 20:50:56 crc kubenswrapper[4744]: I1205 20:50:56.723133 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lpdwr" podStartSLOduration=2.302954636 podStartE2EDuration="3.723116077s" podCreationTimestamp="2025-12-05 20:50:53 +0000 UTC" firstStartedPulling="2025-12-05 20:50:54.683488261 +0000 UTC m=+2424.913299669" lastFinishedPulling="2025-12-05 20:50:56.103649702 +0000 UTC m=+2426.333461110" observedRunningTime="2025-12-05 20:50:56.721574079 +0000 UTC m=+2426.951385447" watchObservedRunningTime="2025-12-05 20:50:56.723116077 +0000 UTC m=+2426.952927445" Dec 05 20:51:03 crc kubenswrapper[4744]: I1205 20:51:03.080869 4744 scope.go:117] "RemoveContainer" containerID="e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6" Dec 05 20:51:03 crc kubenswrapper[4744]: E1205 20:51:03.081934 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9" Dec 05 20:51:03 crc 
Dec 05 20:51:03 crc kubenswrapper[4744]: I1205 20:51:03.412113 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lpdwr"
Dec 05 20:51:03 crc kubenswrapper[4744]: I1205 20:51:03.412197 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lpdwr"
Dec 05 20:51:03 crc kubenswrapper[4744]: I1205 20:51:03.487958 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lpdwr"
Dec 05 20:51:03 crc kubenswrapper[4744]: I1205 20:51:03.833976 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lpdwr"
Dec 05 20:51:07 crc kubenswrapper[4744]: I1205 20:51:07.084063 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"]
Dec 05 20:51:07 crc kubenswrapper[4744]: I1205 20:51:07.084533 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lpdwr" podUID="3e507059-c112-4811-9ccd-793a5dab26f9" containerName="registry-server" containerID="cri-o://fe4fc45bc54001bc3136347aba0a837478bbe1280d5530bac0feaa80e774bf4d" gracePeriod=2
Dec 05 20:51:07 crc kubenswrapper[4744]: I1205 20:51:07.817008 4744 generic.go:334] "Generic (PLEG): container finished" podID="3e507059-c112-4811-9ccd-793a5dab26f9" containerID="fe4fc45bc54001bc3136347aba0a837478bbe1280d5530bac0feaa80e774bf4d" exitCode=0
Dec 05 20:51:07 crc kubenswrapper[4744]: I1205 20:51:07.818553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerDied","Data":"fe4fc45bc54001bc3136347aba0a837478bbe1280d5530bac0feaa80e774bf4d"}
Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.005224 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpdwr"
Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.123580 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content\") pod \"3e507059-c112-4811-9ccd-793a5dab26f9\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") "
Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.123726 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-utilities\") pod \"3e507059-c112-4811-9ccd-793a5dab26f9\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") "
Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.123789 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c86f\" (UniqueName: \"kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f\") pod \"3e507059-c112-4811-9ccd-793a5dab26f9\" (UID: \"3e507059-c112-4811-9ccd-793a5dab26f9\") "
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.137134 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f" (OuterVolumeSpecName: "kube-api-access-6c86f") pod "3e507059-c112-4811-9ccd-793a5dab26f9" (UID: "3e507059-c112-4811-9ccd-793a5dab26f9"). InnerVolumeSpecName "kube-api-access-6c86f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.176776 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e507059-c112-4811-9ccd-793a5dab26f9" (UID: "3e507059-c112-4811-9ccd-793a5dab26f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.226267 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.226317 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e507059-c112-4811-9ccd-793a5dab26f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.226327 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c86f\" (UniqueName: \"kubernetes.io/projected/3e507059-c112-4811-9ccd-793a5dab26f9-kube-api-access-6c86f\") on node \"crc\" DevicePath \"\"" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.831433 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpdwr" event={"ID":"3e507059-c112-4811-9ccd-793a5dab26f9","Type":"ContainerDied","Data":"9194b602c1c83c3347fe62b1e61a38889847da4d78912e036c827b16acda3ca4"} Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.831504 4744 scope.go:117] "RemoveContainer" containerID="fe4fc45bc54001bc3136347aba0a837478bbe1280d5530bac0feaa80e774bf4d" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.831639 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lpdwr" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.859714 4744 scope.go:117] "RemoveContainer" containerID="e7b920ea7e4056d83403bb6912c0b4bcfc2078f6b7f52f2ec0fd2a81f3063a14" Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.882285 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"] Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.888728 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lpdwr"] Dec 05 20:51:08 crc kubenswrapper[4744]: I1205 20:51:08.910029 4744 scope.go:117] "RemoveContainer" containerID="963c2786d113fbb6f680689f610b0e0c2f9c8ed66a4759bb5c04b0058707c032" Dec 05 20:51:10 crc kubenswrapper[4744]: I1205 20:51:10.096418 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e507059-c112-4811-9ccd-793a5dab26f9" path="/var/lib/kubelet/pods/3e507059-c112-4811-9ccd-793a5dab26f9/volumes" Dec 05 20:51:16 crc kubenswrapper[4744]: I1205 20:51:16.081320 4744 scope.go:117] "RemoveContainer" containerID="e77c9969bc89046f5d4de295f3e3c92046039b17a12043b784a0ba76a197f9f6" Dec 05 20:51:16 crc kubenswrapper[4744]: E1205 20:51:16.082091 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkhvd_openshift-machine-config-operator(e25986a8-4343-4c98-bc53-6c1b077661f9)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkhvd" podUID="e25986a8-4343-4c98-bc53-6c1b077661f9"